
Commit f7ba952

sarutak authored and HyukjinKwon committed
[SPARK-33048][BUILD] Fix SparkBuild.scala to recognize build settings for Scala 2.13
### What changes were proposed in this pull request?

This PR fixes `SparkBuild.scala` to recognize build settings for Scala 2.13.

In `SparkBuild.scala`, the variable `scalaBinaryVersion` is hardcoded as `2.12`, so the environment variable `SPARK_SCALA_VERSION` is also set to `2.12`. This causes some test suites (e.g. `SparkSubmitSuite`) to fail:

```
===== TEST OUTPUT FOR o.a.s.deploy.SparkSubmitSuite: 'user classpath first in driver' =====
20/10/02 08:55:30.234 redirect stderr for command /home/kou/work/oss/spark-scala-2.13/bin/spark-submit INFO Utils: Error: Could not find or load main class org.apache.spark.launcher.Main
20/10/02 08:55:30.235 redirect stderr for command /home/kou/work/oss/spark-scala-2.13/bin/spark-submit INFO Utils: /home/kou/work/oss/spark-scala-2.13/bin/spark-class: line 96: CMD: bad array subscript
```

The error occurs because the environment variables `SPARK_JARS_DIR` and `LAUNCH_CLASSPATH` are defined in `bin/spark-class` as follows:

```
SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
LAUNCH_CLASSPATH="${SPARK_HOME}/launcher/target/scala-$SPARK_SCALA_VERSION/classes:$LAUNCH_CLASSPATH"
```

### Why are the changes needed?

To build for Scala 2.13 successfully.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Tests for the `core` module finish successfully:

```
build/sbt -Pscala-2.13 clean "core/test"
```

Closes #29927 from sarutak/fix-sparkbuild-for-scala-2.13.

Authored-by: Kousuke Saruta <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
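The fix replaces the hardcoded string with sbt's `scalaBinaryVersion` setting, which sbt derives from the full `scalaVersion` (e.g. `2.13.3` → `2.13`). A minimal standalone sketch of that derivation for pre-3.x Scala versions — the object and method names here are hypothetical illustrations, not code from `SparkBuild.scala`:

```scala
// Hypothetical sketch: how a binary version can be derived from a full
// Scala version string (for 2.x versions), instead of hardcoding "2.12".
// sbt's own scalaBinaryVersion setting performs an equivalent derivation.
object ScalaBinaryVersion {
  // "2.13.3" -> "2.13", "2.12.10" -> "2.12"
  def binaryVersion(fullVersion: String): String =
    fullVersion.split('.').take(2).mkString(".")

  def main(args: Array[String]): Unit = {
    println(binaryVersion("2.13.3"))  // prints 2.13
    println(binaryVersion("2.12.10")) // prints 2.12
  }
}
```

Because the value is computed from the actual build's `scalaVersion` rather than fixed at `2.12`, a `-Pscala-2.13` build exports `SPARK_SCALA_VERSION=2.13` and the test launcher resolves the correct `scala-2.13` directories.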
1 parent b205be5 commit f7ba952

File tree: 1 file changed, +1 −27 lines


project/SparkBuild.scala

Lines changed: 1 addition & 27 deletions
```diff
@@ -94,21 +94,6 @@ object SparkBuild extends PomBuild {
       case Some(v) =>
         v.split("(\\s+|,)").filterNot(_.isEmpty).map(_.trim.replaceAll("-P", "")).toSeq
     }
-
-    // TODO: revisit for Scala 2.13 support
-    /*
-    Option(System.getProperty("scala.version"))
-      .filter(_.startsWith("2.11"))
-      .foreach { versionString =>
-        System.setProperty("scala-2.11", "true")
-      }
-    if (System.getProperty("scala-2.11") == "") {
-      // To activate scala-2.10 profile, replace empty property value to non-empty value
-      // in the same way as Maven which handles -Dname as -Dname=true before executes build process.
-      // see: https://github.com/apache/maven/blob/maven-3.0.4/maven-embedder/src/main/java/org/apache/maven/cli/MavenCli.java#L1082
-      System.setProperty("scala-2.11", "true")
-    }
-    */
     profiles
   }

@@ -965,17 +950,6 @@ object CopyDependencies {

 object TestSettings {
   import BuildCommons._
-
-  // TODO revisit for Scala 2.13 support
-  private val scalaBinaryVersion = "2.12"
-  /*
-    if (System.getProperty("scala-2.11") == "true") {
-      "2.11"
-    } else {
-      "2.12"
-    }
-  */
-
   private val defaultExcludedTags = Seq("org.apache.spark.tags.ChromeUITest")

   lazy val settings = Seq (
@@ -988,7 +962,7 @@ object TestSettings {
       (fullClasspath in Test).value.files.map(_.getAbsolutePath)
         .mkString(File.pathSeparator).stripSuffix(File.pathSeparator),
       "SPARK_PREPEND_CLASSES" -> "1",
-      "SPARK_SCALA_VERSION" -> scalaBinaryVersion,
+      "SPARK_SCALA_VERSION" -> scalaBinaryVersion.value,
       "SPARK_TESTING" -> "1",
       "JAVA_HOME" -> sys.env.get("JAVA_HOME").getOrElse(sys.props("java.home"))),
     javaOptions in Test += s"-Djava.io.tmpdir=$testTempDir",
```
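The quoted `bin/spark-class` lines show why the hardcoded value breaks the launcher: the jars directory path embeds the Scala binary version, so a stale `SPARK_SCALA_VERSION` points at a directory that does not exist in a 2.13 build. A minimal shell sketch of that failure mode — the paths under `/tmp/spark-demo` are hypothetical, not a real Spark checkout:

```shell
# Hypothetical illustration: only the scala-2.13 directory exists,
# mimicking an assembly built with -Pscala-2.13.
SPARK_HOME=/tmp/spark-demo
mkdir -p "$SPARK_HOME/assembly/target/scala-2.13/jars"

# With the stale, hardcoded value the resolved path is missing.
SPARK_SCALA_VERSION=2.12
SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-${SPARK_SCALA_VERSION}/jars"
[ -d "$SPARK_JARS_DIR" ] && echo "jars dir found" || echo "jars dir missing"

# With the value derived from the actual build, the path resolves.
SPARK_SCALA_VERSION=2.13
SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-${SPARK_SCALA_VERSION}/jars"
[ -d "$SPARK_JARS_DIR" ] && echo "jars dir found" || echo "jars dir missing"
```

This prints "jars dir missing" for the hardcoded `2.12` case and "jars dir found" once the version matches the build, which is exactly the mismatch the `SparkSubmitSuite` failure above surfaced.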
