[SPARK-18326][SPARKR][ML] Review SparkR ML wrappers API for 2.1 #16169
Conversation
Test build #69725 has finished for PR 16169 at commit
How would we set [...]? Is it ok to call this after calling [...]?
Generally looks good; I have a question above.
Force-pushed from a5c3b8b to a355dde.
@felixcheung This is a good question. It's reasonable to call [...]
Test build #69800 has finished for PR 16169 at commit
R/pkg/R/mllib.R (outdated):

    - if (!is.numeric(reg) || reg < 0) {
    + if (!is.numeric(regParam) || regParam < 0) {
          stop("reg should be a nonnegative number.")
`reg` -> `regParam`
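The reviewer's point is that the error message should also use the renamed parameter. A minimal sketch of the corrected check, written here as a hypothetical standalone helper (the real check lives inline inside the SparkR wrapper functions in `R/pkg/R/mllib.R`):

```r
# Hypothetical helper illustrating the corrected validation: both the
# condition and the error message refer to regParam, not reg.
checkRegParam <- function(regParam) {
  if (!is.numeric(regParam) || regParam < 0) {
    stop("regParam should be a nonnegative number.")
  }
  invisible(regParam)
}
```

With this, `checkRegParam(0.1)` passes silently, while `checkRegParam(-1)` and `checkRegParam("a")` fail with a message that names the actual argument.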
I don't really see the harm in letting users specify probabilityCol beforehand, except that they may not have a good way to map the indices to String labels. I'm OK with removing it for now, though.
Test build #69840 has finished for PR 16169 at commit
Merged into master and branch-2.1. Thanks for all your reviews.
## What changes were proposed in this pull request?

Reviewing the SparkR ML wrappers API for the 2.1 release, mainly two issues:

* Remove `probabilityCol` from the argument lists of `spark.logit` and `spark.randomForest`. It is used when making predictions and should be an argument of `predict`; we will work on this at [SPARK-18618](https://issues.apache.org/jira/browse/SPARK-18618) in the next release cycle.
* Fix `spark.als` params to make them consistent with MLlib.

## How was this patch tested?

Existing tests.

Author: Yanbo Liang <[email protected]>

Closes #16169 from yanboliang/spark-18326.

(cherry picked from commit 9725549)
Signed-off-by: Yanbo Liang <[email protected]>
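With this change, per-class probability information is expected to come from the prediction step rather than from a training-time `probabilityCol` argument. A hedged sketch of the resulting workflow is below; it assumes a working Spark installation with SparkR available, and uses the real `spark.logit` and `predict` APIs, though exact output columns can vary across Spark versions (this is illustrative, not part of the patch):

```r
library(SparkR)
sparkR.session()  # requires a local Spark installation

# createDataFrame replaces "." in column names with "_" (Sepal.Length -> Sepal_Length)
df <- createDataFrame(iris)

# Fit a logistic regression. Note: no probabilityCol argument at fit time;
# regParam is one of the regular tuning parameters.
model <- spark.logit(df, Species ~ Sepal_Length + Sepal_Width, regParam = 0.01)

# Prediction is where per-class output belongs (to be extended in SPARK-18618).
preds <- predict(model, df)
head(select(preds, "Species", "prediction"))

sparkR.session.stop()
```

The design point being illustrated: keeping prediction-time options out of the fit-time signature keeps the R wrappers aligned with how MLlib separates `fit` and `transform`.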