Conversation

@dongjoon-hyun
Member

@dongjoon-hyun dongjoon-hyun commented Nov 21, 2020

What changes were proposed in this pull request?

This PR aims to update the test libraries.

  • ScalaTest: 3.2.0 -> 3.2.3
  • JUnit: 4.12 -> 4.13.1
  • Mockito: 3.1.0 -> 3.4.6
  • JMock: 2.8.4 -> 2.12.0
  • maven-surefire-plugin: 3.0.0-M3 -> 3.0.0-M5
  • scala-maven-plugin: 4.3.0 -> 4.4.0
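
A quick way to sanity-check the resolved versions after the bump (a hedged sketch, not part of this PR; the module choice is only illustrative and it assumes the standard Maven dependency plugin):

```bash
# List the test libraries resolved for the core module and filter the ones
# touched by this PR.
./build/mvn -pl core dependency:list | grep -Ei 'scalatest|junit|mockito|jmock'
```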

Why are the changes needed?

This will make the test frameworks up-to-date for Apache Spark 3.1.0.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Pass the CIs.

@github-actions github-actions bot added the BUILD label Nov 21, 2020
@SparkQA

SparkQA commented Nov 21, 2020

Test build #131482 has started for PR 30456 at commit 1328291.

@SparkQA

SparkQA commented Nov 21, 2020

Test build #131483 has started for PR 30456 at commit e1646e1.

@SparkQA

SparkQA commented Nov 21, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36088/

@maropu
Member

maropu commented Nov 22, 2020

retest this please

@maropu
Member

maropu commented Nov 22, 2020

Mockito: 3.1.0 -> 3.5.15

not 3.6.0 but 3.5.15?

@SparkQA

SparkQA commented Nov 22, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36089/

@SparkQA

SparkQA commented Nov 22, 2020

Kubernetes integration test status success
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36089/

@SparkQA

SparkQA commented Nov 22, 2020

Test build #131484 has finished for PR 30456 at commit e1646e1.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member Author

@dongjoon-hyun dongjoon-hyun left a comment

Thank you for the review, @maropu and @srowen .

not 3.6.0 but 3.5.15?

@maropu . Yes. They were released on the same day (Oct 19, 2020); 3.5.15 is the more stable one.

@srowen . It's because mockito-core 3.5.0 upgraded its objenesis dependency to 3.1 (Aug 2020). To avoid the change to objenesis, we can use mockito-core 3.4.6.
https://mvnrepository.com/artifact/org.mockito/mockito-core/3.5.0

@srowen
Member

srowen commented Nov 22, 2020

Yeah, but why does mockito affect the non-test dependencies?

@dongjoon-hyun
Member Author

Ah, good question. I missed the point of your question. Let me check.

@dongjoon-hyun
Member Author

@srowen . The root cause is Twitter Chill. It already pulls in objenesis as a compile dependency, and that dependency ends up upgraded to 3.1.0 because of the new test dependency.

```
[INFO] +- com.twitter:chill_2.12:jar:0.9.5:compile
[INFO] |  \- com.esotericsoftware:kryo-shaded:jar:4.0.2:compile
[INFO] |     +- com.esotericsoftware:minlog:jar:1.3.0:compile
[INFO] |     \- org.objenesis:objenesis:jar:2.5.1:compile
```
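
For reference, a hedged way to trace where objenesis enters the compile classpath (not from the original discussion; assumes the standard Maven dependency plugin):

```bash
# Show only the dependency paths that pull in org.objenesis artifacts.
./build/mvn -pl core dependency:tree -Dincludes=org.objenesis
```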

@dongjoon-hyun
Member Author

I switched to Mockito 3.4.6 to avoid any dependency change in this PR.
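
As an aside (a hedged sketch, not stated in the thread), Spark's dependency manifest check is one way to confirm that a test-only bump leaves the compile-scope classpath untouched:

```bash
# Regenerates the dependency manifests and diffs them against the checked-in
# files under dev/deps/; a compile-scope change would show up as a diff.
./dev/test-dependencies.sh
```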

@srowen
Member

srowen commented Nov 22, 2020

OK. Huh, weird that it can change the compile-time classpath if it's a test dependency!

@SparkQA

SparkQA commented Nov 22, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36107/

@SparkQA

SparkQA commented Nov 22, 2020

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36107/

@dongjoon-hyun
Member Author

dongjoon-hyun commented Nov 22, 2020

Jenkins is already running the SparkR tests here.

The last commit is an empty commit, just to make sure the checks also run in GitHub Actions.

@SparkQA

SparkQA commented Nov 22, 2020

Test build #131503 has finished for PR 30456 at commit befd6cf.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Nov 22, 2020

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36114/

@SparkQA

SparkQA commented Nov 22, 2020

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/36114/

@SparkQA

SparkQA commented Nov 22, 2020

Test build #131510 has finished for PR 30456 at commit 687d122.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun
Member Author

Could you review this again, @srowen and @maropu ?

@dongjoon-hyun
Member Author

Could you review this, @viirya ?

Member

@viirya viirya left a comment

These changes are only for test dependencies and plugins. Looks okay.

@dongjoon-hyun
Member Author

Thank you so much, @viirya !
Merged to master for Apache Spark 3.1.

@dongjoon-hyun dongjoon-hyun deleted the SPARK-33512 branch November 23, 2020 00:41
HyukjinKwon added a commit that referenced this pull request Jan 5, 2021
### What changes were proposed in this pull request?

This PR is a partial revert of #30456 by downgrading scala-maven-plugin from 4.4.0 to 4.3.0.

Currently, when you run the docker release script (`./dev/create-release/do-release-docker.sh`), it fails to compile as below during incremental compilation with zinc for an unknown reason:

```
[INFO] Compiling 21 Scala sources and 3 Java sources to /opt/spark-rm/output/spark-3.1.0-bin-hadoop2.7/resource-managers/yarn/target/scala-2.12/test-classes ...
[ERROR] ## Exception when compiling 24 sources to /opt/spark-rm/output/spark-3.1.0-bin-hadoop2.7/resource-managers/yarn/target/scala-2.12/test-classes
java.lang.SecurityException: class "javax.servlet.SessionCookieConfig"'s signer information does not match signer information of other classes in the same package
java.lang.ClassLoader.checkCerts(ClassLoader.java:891)
java.lang.ClassLoader.preDefineClass(ClassLoader.java:661)
java.lang.ClassLoader.defineClass(ClassLoader.java:754)
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
java.net.URLClassLoader.access$100(URLClassLoader.java:74)
java.net.URLClassLoader$1.run(URLClassLoader.java:369)
java.net.URLClassLoader$1.run(URLClassLoader.java:363)
java.security.AccessController.doPrivileged(Native Method)
java.net.URLClassLoader.findClass(URLClassLoader.java:362)
java.lang.ClassLoader.loadClass(ClassLoader.java:418)
java.lang.ClassLoader.loadClass(ClassLoader.java:351)
java.lang.Class.getDeclaredMethods0(Native Method)
java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
java.lang.Class.privateGetPublicMethods(Class.java:2902)
java.lang.Class.getMethods(Class.java:1615)
sbt.internal.inc.ClassToAPI$.toDefinitions0(ClassToAPI.scala:170)
sbt.internal.inc.ClassToAPI$.$anonfun$toDefinitions$1(ClassToAPI.scala:123)
scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
sbt.internal.inc.ClassToAPI$.toDefinitions(ClassToAPI.scala:123)
sbt.internal.inc.ClassToAPI$.$anonfun$process$1(ClassToAPI.scala:3
```

This happens when building Spark with Hadoop 2. It doesn't reproduce when you build this alone; it seems to require the build sequence used in the release script.

This is fixed by downgrading. It looks like there is a regression in scala-maven-plugin somewhere between 4.3.0 and 4.4.0.
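
For reference, a hedged way to confirm which scala-maven-plugin version is actually in effect after the downgrade (not part of the original commit; assumes Maven's help plugin):

```bash
# Dump the effective POM and show the plugin's resolved version.
./build/mvn help:effective-pom | grep -A 1 'scala-maven-plugin'
```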

### Why are the changes needed?

To unblock the release.

### Does this PR introduce _any_ user-facing change?

No, dev-only.

### How was this patch tested?

It can be tested as below:

```bash
./dev/create-release/do-release-docker.sh -d $WORKING_DIR
```

Closes #31031 from HyukjinKwon/SPARK-34007.

Authored-by: HyukjinKwon <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
HyukjinKwon added a commit that referenced this pull request Jan 5, 2021
Same commit message as the #31031 commit above.
(cherry picked from commit 356fdc9)
Signed-off-by: HyukjinKwon <[email protected]>