
Conversation

@srowen (Member) commented Feb 9, 2017

  • Move external/java8-tests tests into core, streaming, sql and remove
  • Remove MaxPermGen and related options
  • Fix some reflection / TODOs around Java 8+ methods
  • Update doc references to 1.7/1.8 differences
  • Remove Java 7/8 related build profiles
  • Update some plugins for better Java 8 compatibility
  • Fix a few Java-related warnings

For the future:

  • Update Java 8 examples to fully use Java 8
  • Update Java tests to use lambdas for simplicity
  • Update Java internal implementations to use lambdas

How was this patch tested?

Existing tests

Review thread on build/mvn (outdated)
Contributor:

Is ReservedCodeCacheSize no longer applicable to Java 8?

Member Author:

It's still in Java 8. I actually removed this because I think it's defunct and no longer needed, though I admit that's not strictly related to Java 8. I'd put it back if anyone has doubts, but if nobody can recall what it's for (I've never set it in dev or production), maybe it's removable now.

Contributor:

I think it was there to speed compilation up. Unless you want to do a controlled experiment to make sure it doesn't regress, I'd just leave it there.
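For context, ReservedCodeCacheSize caps the JVM's JIT code cache; if the cache fills up, JIT compilation stops, which is consistent with the flag being there to speed long compile runs up. A sketch of how such a flag is passed to Maven (the -Xmx and size values here are assumed for illustration, not taken from this PR):

```shell
# Illustration only: a JVM flag like the one under discussion, handed to
# Maven via MAVEN_OPTS the way build/mvn-style wrapper scripts do.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```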

@rxin (Contributor) commented Feb 9, 2017

With this, what's the behavior if users use a Java 7 runtime to run Spark? What kind of errors do we generate?

Contributor:

How often is this used? If it's rare, maybe just remove this function entirely.

Member Author:

Only about 5 usages. I'll inline it.

Contributor (@rxin), Feb 9, 2017:

For API tests it's still good to use the test package scope, because we want to make sure users can use this.

Member Author:

Sounds good. I was matching JavaAPISuite, which isn't in test scope, but I agree with you. (I could move `JavaAPISuite` too?)

Contributor:

Yeah, it'd make sense to move JavaAPISuite as well.

@srowen (Member Author) commented Feb 9, 2017

You would immediately get an "unsupported major/minor version" error, because all of the byte code would specify 52.0 (= Java 8) and JDK 7 would reject it.
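To make that concrete, here's a small shell sketch (not from the PR) that writes a hand-built 8-byte class-file header to a temp file and reads back the major version stored in bytes 6-7:

```shell
# Bytes 0-3 of a .class file are the 0xCAFEBABE magic; bytes 4-5 the minor
# version; bytes 6-7 the big-endian major version (51 = Java 7, 52 = Java 8).
f=$(mktemp)
printf '\312\376\272\276\000\000\000\064' > "$f"   # octal for CA FE BA BE 00 00 00 34
major=$(od -An -t u1 -j 6 -N 2 "$f" | awk '{print $1 * 256 + $2}')
echo "bytecode major version: $major"              # 52, i.e. Java 8
rm -f "$f"
```

A JDK 7 runtime rejects any class whose major version exceeds 51, which is what produces the "unsupported major/minor version" error.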

@SparkQA commented Feb 9, 2017

Test build #72640 has finished for PR 16871 at commit 56864a8.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 9, 2017

Test build #72644 has finished for PR 16871 at commit 43cf190.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen srowen changed the title [SPARK-19493][BUILD][CORE][WIP] Remove Java 7 support [SPARK-19550][BUILD][CORE][WIP] Remove Java 7 support Feb 10, 2017
@SparkQA commented Feb 10, 2017

Test build #72709 has finished for PR 16871 at commit dd51a6b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member:

Now, there is no $perm here.

Member Author:

Oops, thank you. Will fix that in my next push.

Contributor:

This is still needed, right? This is a scalac thing, not a javac thing.

Member Author:

I think this was the config for the java8-tests module that's gone now, so it's not needed?

I think that comment meant 2.10.4 but I need to check 2.10 compatibility before going much further.

Contributor:

This is the "-target" parameter to scalac, IIRC. From 2.10.6:

  -target:<target>           Target platform for object files. All JVM 1.5 targets are deprecated. (jvm-1.5,jvm-1.5-fjbg,jvm-1.5-asm,jvm-1.6,jvm-1.7,msil) default:jvm-1.6

No 1.8 there.

Member Author:

I see, so we need scalac to target 1.7 byte code for the Scala code it emits in 2.10. That would be fine for moving to Java 8 if javac (8) were handling Java code using Java 8, because the result would just be a mix of 7 and 8 bytecode but no big deal. Do I have that right?

Contributor:

I agree with the part where you say "that would be fine", not sure I follow the rest.

But yeah you'll end up with mixed 1.7 and 1.8 bytecode as far as I can see. That's fine. It just means it will take a little longer for the class version exception to be thrown in 1.7 in some cases.

Member Author:

Hm, I can't tell from the latest master-compile-2.10 output: https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-compile-maven-scala-2.10/3678/consoleFull
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-compile-maven-scala-2.10/configure

Pardon @shaneknapp but would you be able to answer whether the Spark Scala 2.10 builds actually use a Scala 2.10 binary, or something later?

If it's actually using 2.10, I suppose we could just decide to run all builds with at least Scala 2.11 even those cross-compiled for 2.10.

Contributor (@shaneknapp), Feb 13, 2017:

@srowen -- these builds grab the Scala jar of the proper version depending on which axis is being built. Look at the code in the dev/ directory. We don't even have Scala installed anywhere on those systems by default!

I'm also CCing @JoshRosen, as the build scripts have come a long way since I last poked at them. :)

Contributor:

My best guess is that run-tests.py is just using whatever Scala is defined by the build. How are you doing this cross-compilation, by running zinc? If that's the case, maybe disable zinc and try to build for 2.10.

Member Author:

Thanks @shaneknapp -- so there's no more to the tool chain here than the Scala jars, OK. @vanzin I tried disabling zinc and it still built and ran tests after switching to 2.10. I tried to emulate the job that the master-2.10 Jenkins job would run. I think we're OK on this front?

Contributor:

Yeah, if it's working for you, that's good enough. I guess we'll find out pretty quickly if something needs to be adjusted in the Jenkins jobs.

Contributor:

Is it worth it to keep the "test." package there? That avoids having the test use package-private APIs inadvertently (if any of those exist).

Member Author:

Yeah will do on a few test suites like this, to put them back in test.*

@SparkQA commented Feb 11, 2017

Test build #72724 has finished for PR 16871 at commit 7f8a2cb.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member:

Perhaps we don't want to change this -- this new version of Roxygen is not on Jenkins.

Member Author:

Yeah, I was going to ask about this. I didn't change it myself; it got changed when I ran the build (is that to be expected?). I do have a later Roxygen installed, but I don't think anything changed that would require it, and I don't know the implications of this either. I can just revert it, of course, as it's not related.

Member:

Yeah, it updates this line automatically if the Roxygen2 version installed with your R is different.
6.0 was just released two weeks ago and it seems not everything is backward compatible, so it's probably safer to hold off for now (that, and it might or might not match what we have on Jenkins).

@SparkQA commented Feb 12, 2017

Test build #72789 has finished for PR 16871 at commit e5541cd.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 13, 2017

Test build #72816 has finished for PR 16871 at commit 79d9a08.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen (Member Author) commented Feb 13, 2017

I'll wait at least another day or two to merge, but are there any significant objections to this Java 7 change? Note that the test / example changes will come afterwards; they're about as large, though more straightforward.

@SparkQA commented Feb 15, 2017

Test build #72896 has finished for PR 16871 at commit 736c907.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen (Member Author) commented Feb 16, 2017

Merged to master

@asfgit asfgit closed this in 0e24054 Feb 16, 2017
@srowen (Member Author) commented Feb 16, 2017

This caused the SBT build to fail because MiMa doesn't like Java 8 bytecode, but this might be a simple config issue. Investigating ...

https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/2466/console

@srowen (Member Author) commented Feb 16, 2017

Yep: since dev/mima runs just `java`, it's likely picking up Java 7 from the Jenkins machines. The jobs all set JAVA_HOME, and set it to the Java 8 home. Other Spark scripts will use $JAVA_HOME/bin/java to run java commands if that's set, and otherwise just `java`; see dev/check-license. I think that's the fix here too. I'll push it shortly as a hot-fix if it works for me, though it's kind of a small separate issue.
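A minimal sketch of the fallback pattern described above (the same shape as dev/check-license; /opt/jdk8 below is a hypothetical path for illustration):

```shell
# Prefer $JAVA_HOME/bin/java when JAVA_HOME is set; otherwise fall back to
# whatever `java` is on the PATH.
resolve_java() {
  if [ -n "${JAVA_HOME:-}" ]; then
    echo "$JAVA_HOME/bin/java"
  else
    echo "java"
  fi
}

JAVA_HOME=/opt/jdk8
resolve_java        # prints /opt/jdk8/bin/java
unset JAVA_HOME
resolve_java        # prints java
```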

ghost pushed a commit to dbtsai/spark that referenced this pull request Feb 16, 2017
…et in dev/mima

## What changes were proposed in this pull request?

Use JAVA_HOME/bin/java, if JAVA_HOME is set, in the dev/mima script to run MiMa.
This follows on apache#16871 -- it's a slightly separate issue, but it is currently causing a build failure.

## How was this patch tested?

Manually tested.

Author: Sean Owen <[email protected]>

Closes apache#16957 from srowen/SPARK-19550.2.
@srowen srowen deleted the SPARK-19493 branch February 17, 2017 09:04
@ericl (Contributor) commented Feb 17, 2017

I think this also broke scala-2.10 compilation here: https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/spark-master-compile-sbt-scala-2.10/

Investigating.

@srowen (Member Author) commented Feb 17, 2017

Darn. The Maven master 2.10 build is fine, but the SBT one is not: `[error] 'jvm-1.8' is not a valid choice for '-target'`

It might work to output 1.7 bytecode from SBT just for 2.10. Maybe I'm overlooking why that wouldn't work, but, setting source/target to 1.7 for 2.10 is a good place to start. I can try that.

Another option is to ignore it, because the SBT build is a convenience build and perhaps not many people are developing against Spark 2.2 and still on Scala 2.10.

Another option is to go further and drop 2.10 support. I wouldn't want to do that solely because of this, but if others favored dropping 2.10 support, that would also make this not a problem.

It's possible to roll back but let's exhaust the other options above first.

@vanzin (Contributor) commented Feb 17, 2017

It might work to output 1.7 bytecode from SBT just for 2.10.

This is the discussion we were having earlier (#16871 (comment)). If you copy&paste the code that was under the java8 test settings into the place where you set the scala target to 1.8, it should fix this.
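The version-dependent target choice being suggested can be sketched in shell (variable names are illustrative; the real fix lives in the SBT build definition):

```shell
# Scala 2.10's scalac has no jvm-1.8 target (see the -target listing earlier
# in this thread), so the build must fall back to jvm-1.7 bytecode for 2.10
# only, while 2.11+ can target jvm-1.8.
scala_binary_version="2.10"   # illustrative; the build would supply this
if [ "$scala_binary_version" = "2.10" ]; then
  scalac_target="-target:jvm-1.7"
else
  scalac_target="-target:jvm-1.8"
fi
echo "$scalac_target"
```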

@srowen (Member Author) commented Feb 18, 2017

@ericl @vanzin see #16983



8 participants