[SPARK-19550][BUILD][CORE][WIP] Remove Java 7 support #16871
Conversation
build/mvn (outdated)
Is ReservedCodeCacheSize no longer applicable to Java 8?
It's still in Java 8. I actually removed this because I think it's defunct and no longer needed, but I admit it's not strictly related to Java 8. I'd put it back if anyone has doubts, but if nobody can recall what it's for (I've never set it in dev or production), maybe it's removable now.
I think it was there to speed compilation up. Unless you want to do a controlled experiment to make sure it doesn't regress, I'd just leave it there.
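For context, a minimal sketch of how this flag is typically passed to the Maven build; the values and placement here are illustrative, not Spark's actual defaults:

```bash
# Hypothetical sketch: -XX:ReservedCodeCacheSize caps the JIT compiler's
# code cache; raising it was a common trick to keep large builds from
# hitting "CodeCache is full" warnings. Values here are illustrative.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
./build/mvn -DskipTests clean package
```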
With this, what's the behavior if users use a Java 7 runtime to run Spark? What kind of errors do we generate?
How often is this used? If not often, maybe just remove this function entirely.
Only about 5 usages. I'll inline it.
For API tests it's still good to use test package scope, because we want to make sure users can use this.
Sounds good. I was matching JavaAPISuite, which isn't in test scope, but I agree with you. (I could move `JavaAPISuite` too?)
Yeah, it'd make sense to move JavaAPISuite as well.
You would immediately get an "unsupported major/minor version" error, because all of the bytecode would specify 52.0 (= Java 8) and JDK 7 would reject it.
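A quick way to check the class-file version in question, sketched with a throwaway class (52 is Java 8's major version; Java 7 tops out at 51):

```bash
# Compile a trivial class with JDK 8 and inspect its class-file version.
echo 'public class Probe {}' > Probe.java
javac Probe.java
javap -verbose Probe | grep 'major version'
# prints: major version: 52  -- a Java 7 JVM rejects anything above 51
```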
Test build #72640 has finished for PR 16871 at commit
Test build #72644 has finished for PR 16871 at commit
Test build #72709 has finished for PR 16871 at commit
build/sbt-launch-lib.bash (outdated)
Now, there is no $perm here.
Oops, thank you. Will fix that in my next push.
project/SparkBuild.scala (outdated)
This is still needed, right? This is a scalac thing, not a javac thing.
I think this was the config for the java8-tests module that's gone now, so it's not needed?
I think that comment meant 2.10.4 but I need to check 2.10 compatibility before going much further.
This is the "-target" parameter to scalac, IIRC. From 2.10.6:
-target:<target> Target platform for object files. All JVM 1.5 targets are deprecated. (jvm-1.5,jvm-1.5-fjbg,jvm-1.5-asm,jvm-1.6,jvm-1.7,msil) default:jvm-1.6
No 1.8 there.
I see, so we need scalac to target 1.7 bytecode for the Scala code it emits in 2.10. That would be fine for moving to Java 8, as long as javac 8 handles the Java code as Java 8: the result would just be a mix of 7 and 8 bytecode, which is no big deal. Do I have that right?
I agree with the part where you say "that would be fine", not sure I follow the rest.
But yeah you'll end up with mixed 1.7 and 1.8 bytecode as far as I can see. That's fine. It just means it will take a little longer for the class version exception to be thrown in 1.7 in some cases.
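To make the mix concrete, a sketch assuming a Scala 2.10 scalac on the PATH (class names are throwaways):

```bash
# Scala 2.10's scalac tops out at -target:jvm-1.7, so its output stays at
# class-file major version 51 even when javac 8 emits 52 elsewhere.
echo 'object Probe' > Probe.scala
scalac -target:jvm-1.7 Probe.scala
javap -verbose Probe | grep 'major version'
# prints: major version: 51
```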
Hm, I can't tell from the latest master-compile-2.10 output: https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-compile-maven-scala-2.10/3678/consoleFull
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-compile-maven-scala-2.10/configure
Pardon @shaneknapp but would you be able to answer whether the Spark Scala 2.10 builds actually use a Scala 2.10 binary, or something later?
If it's actually using 2.10, I suppose we could just decide to run all builds with at least Scala 2.11 even those cross-compiled for 2.10.
@srowen -- these builds grab the scala jar of the proper version depending on what axis is being built. look in the dev/ directory at the code in there. we don't even have scala installed anywhere on those systems by default!
i'm also CCing @JoshRosen as the build scripts have come a long way since i last poked at them. :)
My best guess is that run-tests.py is just using whatever Scala is defined by the build. How are you doing this cross-compilation, by running zinc? If that's the case, maybe disable zinc and try to build for 2.10.
Thanks @shaneknapp -- so there's no more to the tool chain here than the Scala jars, OK. @vanzin I tried disabling zinc and it still built and ran tests after switching to 2.10. I tried to emulate the job that the master-2.10 Jenkins job would run. I think we're OK on this front?
yeah, if it's working for you, that's good enough. I guess we'll find out pretty quickly if something needs to be adjusted in the jenkins jobs.
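For reference, roughly the manual procedure described above, sketched from the Spark dev scripts of this era (exact flags may differ):

```bash
# Switch the source tree to Scala 2.10, then build with plain Maven
# (no zinc server running) to check cross-compilation.
./dev/change-scala-version.sh 2.10
./build/mvn -Dscala-2.10 -DskipTests clean package
```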
Is it worth keeping the "test." package there? That avoids having the test use package-private APIs inadvertently (if any of those exist).
Yeah, will do on a few test suites like this; I'll put them back in test.*
Test build #72724 has finished for PR 16871 at commit
R/pkg/DESCRIPTION (outdated)
perhaps we don't want to change this - this new version of Roxygen is not on Jenkins
Yeah, I was going to ask about this. I didn't change this myself, but it got changed when I ran the build (is that to be expected?). I do have a later Roxygen installed, but I don't think anything changed that would require it, and I don't know the implications of this either. I can just revert it of course, as it's not related.
Yeah, it'd update this line automatically if your installed Roxygen2 is a different version.
6.0 was released just two weeks ago, and it seems not everything is backward compatible, so it's probably safer to hold off for now (that, and it might or might not match what we have on Jenkins).
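For anyone checking their local state, the field that gets rewritten is RoxygenNote in the package DESCRIPTION; the version value shown below is illustrative:

```bash
# Show which Roxygen2 version last regenerated the package docs.
grep RoxygenNote R/pkg/DESCRIPTION
# e.g.: RoxygenNote: 5.0.1
```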
Test build #72789 has finished for PR 16871 at commit
Test build #72816 has finished for PR 16871 at commit
I'll wait at least another day or two to merge, but are there any significant objections to this Java 7 change? Note that the test/example changes will come afterwards; they're about as large, though more straightforward.
- Move external/java8-tests tests into core, streaming, sql and remove
- Remove MaxPermGen and related options
- Fix some reflection / TODOs around Java 8+ methods
- Update doc references to 1.7/1.8 differences
- Remove Java 7/8 related build profiles
- Update some plugins for better Java 8 compatibility
- Fix a few Java-related warnings
Test build #72896 has finished for PR 16871 at commit
Merged to master
This caused the SBT build to fail because MiMa doesn't like Java 8 bytecode, but this might be a simple config issue. Investigating...
Yep, since
…et in dev/mima

## What changes were proposed in this pull request?

Use JAVA_HOME/bin/java if JAVA_HOME is set in dev/mima script to run MiMa. This follows on apache#16871 -- it's a slightly separate issue, but is currently causing a build failure.

## How was this patch tested?

Manually tested.

Author: Sean Owen <[email protected]>

Closes apache#16957 from srowen/SPARK-19550.2.
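A minimal sketch of the idea in that follow-up; variable names here are illustrative, not the script's actual ones:

```bash
# Prefer the JVM that JAVA_HOME points at; fall back to whatever java is
# on the PATH. This keeps MiMa running under the intended JDK.
if [ -n "$JAVA_HOME" ]; then
  JAVA_CMD="$JAVA_HOME/bin/java"
else
  JAVA_CMD=java
fi
"$JAVA_CMD" -version
```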
I think this also broke scala-2.10 compilation here: https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/spark-master-compile-sbt-scala-2.10/ Investigating.
Darn. The Maven master 2.10 build is fine, but not SBT. It might work to output 1.7 bytecode from SBT just for 2.10; maybe I'm overlooking why that wouldn't work, but setting source/target to 1.7 for 2.10 is a good place to start. I can try that. Another option is to ignore it, because the SBT build is a convenience build and perhaps not many people are developing against Spark 2.2 and still on Scala 2.10. Another option is to go further and drop 2.10 support. I wouldn't want to do that solely because of this, but if others favored dropping 2.10 support, that would also make this a non-problem. It's possible to roll back, but let's exhaust the other options above first.
This is the discussion we were having earlier (#16871 (comment)). If you copy and paste the code that was under the java8 test settings into the place where you set the Scala target to 1.8, it should fix this.
For the future:
How was this patch tested? Existing tests.