
@sarutak (Member) commented Nov 26, 2021

What changes were proposed in this pull request?

This PR adds a configuration to Dockerfile.java17 to set the environment variable JAVA_HOME for Java 17 installed by apt-get.
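The change itself is a one-line ENV instruction. A minimal sketch, assuming the Debian/Ubuntu layout where apt-get installs OpenJDK 17 under /usr/lib/jvm (the exact directory name below is illustrative and depends on the base image and architecture):

```dockerfile
# Point JAVA_HOME at the apt-get-installed JDK so that
# entrypoint.sh can resolve ${JAVA_HOME}/bin/java.
# The path is illustrative; adjust it to the actual install
# location in your base image (e.g. arm64 variants differ).
ENV JAVA_HOME /usr/lib/jvm/java-17-openjdk-amd64
```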

Why are the changes needed?

In entrypoint.sh, ${JAVA_HOME}/bin/java is used, but the container image built from Dockerfile.java17 does not set that environment variable.
As a result, executors can't launch.

+ CMD=(${JAVA_HOME}/bin/java "${SPARK_EXECUTOR_JAVA_OPTS[@]}" -Xms$SPARK_EXECUTOR_MEMORY -Xmx$SPARK_EXECUTOR_MEMORY -cp "$SPARK_CLASSPATH:$SPARK_DIST_CLASSPATH" org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBackend --driver-url $SPARK_DRIVER_URL --executor-id $SPARK_EXECUTOR_ID --cores $SPARK_EXECUTOR_CORES --app-id $SPARK_APPLICATION_ID --hostname $SPARK_EXECUTOR_POD_IP --resourceProfileId $SPARK_RESOURCE_PROFILE_ID --podName $SPARK_EXECUTOR_POD_NAME)
+ exec /usr/bin/tini -s -- /bin/java -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Dspark.driver.port=42295 -Xms1024m -Xmx1024m -cp '/opt/spark/conf::/opt/spark/jars/*:' org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBackend --driver-url spark://CoarseGrainedScheduler@devel-nuc:42295 --executor-id 70 --cores 1 --app-id spark-f7678047ff284f538b04fef3df44f39a --hostname 172.18.0.3 --resourceProfileId 0 --podName spark-shell-4900407d5b79fb64-exec-70
[FATAL tini (15)] exec /bin/java failed: No such file or directory
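The failure mode above is plain shell expansion: with JAVA_HOME unset, ${JAVA_HOME}/bin/java collapses to /bin/java, which does not exist in the image, hence the tini error. A minimal sketch (the JDK path used below is illustrative, not the image's actual layout):

```shell
#!/bin/sh
# With JAVA_HOME unset, the executor command path degenerates.
unset JAVA_HOME
echo "${JAVA_HOME}/bin/java"    # prints "/bin/java" -- no such file in the image

# Once JAVA_HOME is set (as the Dockerfile change does), the intended
# binary is resolved. The path here is illustrative.
JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64
echo "${JAVA_HOME}/bin/java"    # prints "/usr/lib/jvm/java-17-openjdk-amd64/bin/java"
```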

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Confirmed that the following simple job runs successfully with a container image built from the modified Dockerfile.java17.

$ bin/spark-shell --master k8s://https://<host>:<port> --conf spark.kubernetes.container.image=spark:<tag>
scala> spark.range(10).show
+---+                                                                           
| id|
+---+
|  0|
|  1|
|  2|
|  3|
|  4|
|  5|
|  6|
|  7|
|  8|
|  9|
+---+

@SparkQA commented Nov 26, 2021

Test build #145666 has finished for PR 34722 at commit 08a3686.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Nov 26, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50136/

@SparkQA commented Nov 26, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50136/

@HyukjinKwon (Member) commented

cc @dongjoon-hyun and @holdenk FYI

@holdenk (Contributor) commented Nov 28, 2021

LGTM yay java17

@dongjoon-hyun (Member) left a comment


Thank you, @sarutak and all!
