Conversation

@yanboliang
Contributor

What changes were proposed in this pull request?

GLM (GeneralizedLinearRegression) supports outputting the link prediction, i.e. the value of the linear predictor eta, through a new linkPredictionCol param, in addition to the existing response prediction.
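As an illustration, a minimal sketch of the intended usage; the family/link choice, the "linkPrediction" column name, and the `training`/`test` DataFrames are assumptions for the example, not taken from this patch.

```scala
import org.apache.spark.ml.regression.GeneralizedLinearRegression

// Hypothetical example: a Poisson GLM with log link.
// `training` and `test` are assumed DataFrames with "label" and "features" columns.
val glr = new GeneralizedLinearRegression()
  .setFamily("poisson")
  .setLink("log")
  .setLinkPredictionCol("linkPrediction") // request the link prediction (eta) as an extra column

val model = glr.fit(training)

// "prediction" holds the response prediction mu = g^-1(eta);
// "linkPrediction" holds eta = features . coefficients + intercept.
model.transform(test).select("prediction", "linkPrediction").show()
```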

How was this patch tested?

Unit tests.

@SparkQA

SparkQA commented Apr 10, 2016

Test build #55480 has finished for PR 12287 at commit db1b122.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 12, 2016

Test build #55607 has finished for PR 12287 at commit e202039.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 12, 2016

Test build #55609 has finished for PR 12287 at commit e5aea09.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@yanboliang
Contributor Author

Jenkins, test this please.

@SparkQA

SparkQA commented Apr 12, 2016

Test build #55613 has finished for PR 12287 at commit e5aea09.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

asfgit pushed a commit that referenced this pull request Apr 12, 2016
…t in SparkR:::glm

* SparkR glm supports families and link functions that match R's signature for family.
* SparkR glm API refactor. The reference for the new API is R's glm, so I only expose the arguments that R's glm supports: ```formula, family, data, epsilon and maxit```.
* This PR focuses on glm() and predict(); summary statistics will be done in a separate PR after this gets in.
* This PR depends on #12287, which makes GLMs support link prediction on the Scala side. After that is merged, I will add more tests for predict() to this PR.

Unit tests.

cc mengxr jkbradley hhbyyh

Author: Yanbo Liang <[email protected]>

Closes #12294 from yanboliang/spark-12566.
familyAndLink.fitted(eta)
}

protected def predictLink(features: Vector): Double = {
Contributor

  • This could be a private method.
  • Missing doc.
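For illustration, a sketch of how the method might look with both review points addressed (private visibility plus a doc comment), assuming it lives inside GeneralizedLinearRegressionModel where `coefficients`, `intercept`, and `BLAS` are in scope; the body is inferred from the surrounding context rather than copied from the patch.

```scala
/**
 * Calculates the link prediction (linear predictor) of the given instance:
 * eta = features . coefficients + intercept.
 */
private def predictLink(features: Vector): Double = {
  BLAS.dot(features, coefficients) + intercept
}
```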

@mengxr
Contributor

mengxr commented Apr 19, 2016

Made one pass.

@SparkQA

SparkQA commented Apr 20, 2016

Test build #56347 has finished for PR 12287 at commit cb1b5e6.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

}
super.validateAndTransformSchema(schema, fitting, featuresDataType)
val newSchema = super.validateAndTransformSchema(schema, fitting, featuresDataType)
if ($(linkPredictionCol).nonEmpty) {
@yanboliang
Contributor Author

yanboliang commented Apr 20, 2016

The $(linkPredictionCol).nonEmpty check could be omitted, because SchemaUtils.appendColumn on the next line already checks for an empty column name internally. But checks like this help developers understand the logic clearly, and they exist in lots of places in the code base. If we would like to omit them, I can do the cleanup in a separate PR.
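For reference, a sketch of the pattern being discussed; the method signature and the DoubleType column are assumed from the diff context, and SchemaUtils is org.apache.spark.ml.util.SchemaUtils.

```scala
override protected def validateAndTransformSchema(
    schema: StructType,
    fitting: Boolean,
    featuresDataType: DataType): StructType = {
  val newSchema = super.validateAndTransformSchema(schema, fitting, featuresDataType)
  if ($(linkPredictionCol).nonEmpty) {
    // SchemaUtils.appendColumn already returns the schema unchanged for an empty
    // column name, so this guard is redundant but makes the intent explicit.
    SchemaUtils.appendColumn(newSchema, $(linkPredictionCol), DoubleType)
  } else {
    newSchema
  }
}
```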

@mengxr
Contributor

mengxr commented Apr 22, 2016

LGTM. Merged into master. Thanks!

asfgit closed this in 4e72622 Apr 22, 2016
yanboliang deleted the spark-14479 branch April 22, 2016 02:07