Conversation

@JoshRosen
Contributor

Now that 1.3 has been released, we should enable MiMa checks for the sql subproject.
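For context, wiring MiMa up for a module in sbt amounts to giving the plugin a released baseline artifact to diff against. A minimal sketch, not Spark's actual build code (Spark drives this through `project/SparkBuild.scala` and `MimaBuild.scala`, and key/import names vary across sbt-mima-plugin versions; the mima-prefixed names below match the task names visible in the logs later in this thread):

```scala
// build.sbt sketch: enable MiMa for a sql module by comparing the
// current classfiles against the last released artifact (1.3.0 here).
import com.typesafe.tools.mima.plugin.MimaPlugin.mimaDefaultSettings
import com.typesafe.tools.mima.plugin.MimaKeys.mimaPreviousArtifacts

lazy val sql = (project in file("sql/core"))
  .settings(mimaDefaultSettings: _*)
  .settings(
    // The baseline: MiMa resolves this module from Maven and reports any
    // binary incompatibility introduced since that release.
    mimaPreviousArtifacts := Set("org.apache.spark" % "spark-sql_2.10" % "1.3.0")
  )
```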

@SparkQA

SparkQA commented Apr 27, 2015

Test build #722 has started for PR 5727 at commit 1aae027.

@SparkQA

SparkQA commented Apr 27, 2015

Test build #31068 has finished for PR 5727 at commit 1aae027.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@JoshRosen
Contributor Author

It looks like SQL failed 13 MiMa checks, although all of them appear to be in test code or internal APIs (so we can double-check them, then add the proper excludes / annotations):

[info] spark-sql: found 13 potential binary incompatibilities (filtered 101)
[error]  * method checkAnalysis()org.apache.spark.sql.catalyst.analysis.CheckAnalysis in class org.apache.spark.sql.SQLContext does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.SQLContext.checkAnalysis")
[error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.ExecutedCommand has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.ExecutedCommand.children")
[error]  * class org.apache.spark.sql.execution.AddExchange does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.execution.AddExchange")
[error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.LogicalLocalTable has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalLocalTable.children")
[error]  * method newInstance()org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation in class org.apache.spark.sql.execution.LogicalLocalTable has now a different result type; was: org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation, is now: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalLocalTable.newInstance")
[error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.PhysicalRDD has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.PhysicalRDD.children")
[error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.LocalTableScan has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LocalTableScan.children")
[error]  * object org.apache.spark.sql.execution.AddExchange does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.execution.AddExchange$")
[error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.LogicalRDD has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalRDD.children")
[error]  * method newInstance()org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation in class org.apache.spark.sql.execution.LogicalRDD has now a different result type; was: org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation, is now: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalRDD.newInstance")
[error]  * class org.apache.spark.sql.parquet.ParquetTestData does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.parquet.ParquetTestData")
[error]  * object org.apache.spark.sql.parquet.ParquetTestData does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.parquet.ParquetTestData$")
[error]  * class org.apache.spark.sql.parquet.TestGroupWriteSupport does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.parquet.TestGroupWriteSupport")
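The suggested filters go into Spark's `project/MimaExcludes.scala`. A sketch of what that looks like, with the surrounding structure simplified (only the filter entries themselves are taken verbatim from the MiMa output above; the remaining ones follow the same pattern):

```scala
// Sketch: collecting MiMa's suggested filters in project/MimaExcludes.scala.
import com.typesafe.tools.mima.core._

object MimaExcludes {
  val excludes = Seq(
    // Internal SQLContext method that was removed.
    ProblemFilters.exclude[MissingMethodProblem](
      "org.apache.spark.sql.SQLContext.checkAnalysis"),
    // Internal planner node whose children() result type changed
    // from Nil.type to Seq.
    ProblemFilters.exclude[IncompatibleResultTypeProblem](
      "org.apache.spark.sql.execution.ExecutedCommand.children"),
    // Removed internal class and its companion object (the trailing $
    // names the companion's synthetic class).
    ProblemFilters.exclude[MissingClassProblem](
      "org.apache.spark.sql.execution.AddExchange"),
    ProblemFilters.exclude[MissingClassProblem](
      "org.apache.spark.sql.execution.AddExchange$"),
    // Test-only Parquet helpers that no longer exist.
    ProblemFilters.exclude[MissingClassProblem](
      "org.apache.spark.sql.parquet.ParquetTestData"),
    ProblemFilters.exclude[MissingClassProblem](
      "org.apache.spark.sql.parquet.ParquetTestData$")
    // ... remaining filters follow the same pattern.
  )
}
```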

The launcher project failed its checks because sbt couldn't resolve a spark-launcher 1.3.0 artifact on Maven:

[info] spark-mllib: found 0 potential binary incompatibilities (filtered 242)
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-launcher_2.10;1.3.0: not found
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
    at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
    at xsbt.boot.Using$.withResource(Using.scala:10)
    at xsbt.boot.Using$.apply(Using.scala:9)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
    at sbt.IvySbt.withIvy(Ivy.scala:123)
    at sbt.IvySbt.withIvy(Ivy.scala:120)
    at sbt.IvySbt$Module.withModule(Ivy.scala:151)
    at sbt.IvyActions$.updateEither(IvyActions.scala:157)
    at sbt.IvyActions$.update(IvyActions.scala:145)
    at com.typesafe.tools.mima.plugin.SbtMima$.getPreviousArtifact(SbtMima.scala:75)
    at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4$$anonfun$apply$2.apply(MimaPlugin.scala:30)
    at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4$$anonfun$apply$2.apply(MimaPlugin.scala:30)
    at scala.Option.map(Option.scala:145)
    at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4.apply(MimaPlugin.scala:30)
    at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4.apply(MimaPlugin.scala:29)
    at scala.Function3$$anonfun$tupled$1.apply(Function3.scala:35)
    at scala.Function3$$anonfun$tupled$1.apply(Function3.scala:34)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
java.lang.RuntimeException: spark-sql: Binary compatibility check failed!
    at scala.sys.package$.error(package.scala:27)
    at com.typesafe.tools.mima.plugin.SbtMima$.reportErrors(SbtMima.scala:64)
    at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
    at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
    at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:35)
    at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:34)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
[error] (launcher/*:mimaPreviousClassfiles) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-launcher_2.10;1.3.0: not found

@JoshRosen
Contributor Author

Ah, launcher was only added in 1.4, so I'll put the MiMa exclude back.
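The corresponding fix amounts to giving MiMa no baseline for a module that has never been published. A sketch with illustrative project wiring, not Spark's actual SparkBuild code:

```scala
// Sketch: spark-launcher first shipped in 1.4, so there is no 1.3.0
// artifact for MiMa to resolve. Disable the check for this module by
// declaring an empty set of previous artifacts.
import com.typesafe.tools.mima.plugin.MimaKeys.mimaPreviousArtifacts

lazy val launcher = (project in file("launcher"))
  .settings(
    mimaPreviousArtifacts := Set.empty // re-enable once 1.4.0 is released
  )
```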

JoshRosen changed the title from "[Build] Enable MiMa checks for launcher and sql projects" to "[Build] Enable MiMa checks for SQL" on Apr 27, 2015
@SparkQA

SparkQA commented Apr 28, 2015

Test build #31072 has finished for PR 5727 at commit e276cee.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@JoshRosen
Contributor Author

/cc @marmbrus

@SparkQA

SparkQA commented Apr 28, 2015

Test build #31118 has finished for PR 5727 at commit 0c48e4d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • trait LDAOptimizer
    • class EMLDAOptimizer extends LDAOptimizer
  • This patch does not change any dependencies.

@rxin
Contributor

rxin commented Apr 30, 2015

LGTM.

Merge remote-tracking branch 'origin/master' into enable-more-mima-checks

Conflicts:
	project/MimaExcludes.scala
	project/SparkBuild.scala
@SparkQA

SparkQA commented Apr 30, 2015

Test build #31454 has finished for PR 5727 at commit 3ad302b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@asfgit asfgit closed this in fa01bec Apr 30, 2015
@JoshRosen JoshRosen deleted the enable-more-mima-checks branch April 30, 2015 23:32
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request May 28, 2015
Now that 1.3 has been released, we should enable MiMa checks for the `sql` subproject.

Author: Josh Rosen <[email protected]>

Closes apache#5727 from JoshRosen/enable-more-mima-checks and squashes the following commits:

3ad302b [Josh Rosen] Merge remote-tracking branch 'origin/master' into enable-more-mima-checks
0c48e4d [Josh Rosen] Merge remote-tracking branch 'origin/master' into enable-more-mima-checks
e276cee [Josh Rosen] Fix SQL MiMa checks via excludes and private[sql]
44d0d01 [Josh Rosen] Add back 'launcher' exclude
1aae027 [Josh Rosen] Enable MiMa checks for launcher and sql projects.
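Commit e276cee above names the two fixes: MiMa excludes and `private[sql]`. Narrowing visibility works because a `private[sql]` symbol is no longer part of the module's public binary API, so MiMa skips it entirely and no exclude entry is needed. An illustrative fragment with a made-up class name, not Spark's actual code:

```scala
package org.apache.spark.sql.execution

// Hypothetical internal plan node: private[sql] keeps it usable anywhere
// under the org.apache.spark.sql package, but drops it from the public
// API surface that MiMa compares, so future changes to it no longer
// require exclude entries in MimaExcludes.scala.
private[sql] case class InternalScanNode(output: Seq[String])
```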
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request Jun 12, 2015 (same commit message as above)
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 19, 2015 (same commit message as above)