
Commit a3886ba

sarutak authored and dongjoon-hyun committed
[SPARK-37319][K8S][FOLLOWUP] Set JAVA_HOME for Java 17 installed by apt-get
### What changes were proposed in this pull request?

This PR adds a setting to `Dockerfile.java17` so that the environment variable `JAVA_HOME` points to the Java 17 installed by apt-get.

### Why are the changes needed?

`entrypoint.sh` uses `${JAVA_HOME}/bin/java`, but the container built from `Dockerfile.java17` does not have that environment variable set. As a result, executors can't launch:

```
+ CMD=(${JAVA_HOME}/bin/java "${SPARK_EXECUTOR_JAVA_OPTS[]}" -Xms$SPARK_EXECUTOR_MEMORY -Xmx$SPARK_EXECUTOR_MEMORY -cp "$SPARK_CLASSPATH:$SPARK_DIST_CLASSPATH" org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBackend --driver-url $SPARK_DRIVER_URL --executor-id $SPARK_EXECUTOR_ID --cores $SPARK_EXECUTOR_CORES --app-id $SPARK_APPLICATION_ID --hostname $SPARK_EXECUTOR_POD_IP --resourceProfileId $SPARK_RESOURCE_PROFILE_ID --podName $SPARK_EXECUTOR_POD_NAME)
+ exec /usr/bin/tini -s -- /bin/java -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Dspark.driver.port=42295 -Xms1024m -Xmx1024m -cp '/opt/spark/conf::/opt/spark/jars/*:' org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBackend --driver-url spark://CoarseGrainedScheduler@devel-nuc:42295 --executor-id 70 --cores 1 --app-id spark-f7678047ff284f538b04fef3df44f39a --hostname 172.18.0.3 --resourceProfileId 0 --podName spark-shell-4900407d5b79fb64-exec-70
[FATAL tini (15)] exec /bin/java failed: No such file or directory
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Confirmed that the following simple job runs successfully with a container image built from the modified `Dockerfile.java17`:

```
$ bin/spark-shell --master k8s://https://<host>:<port> --conf spark.kubernetes.container.image=spark:<tag>

scala> spark.range(10).show
+---+
| id|
+---+
|  0|
|  1|
|  2|
|  3|
|  4|
|  5|
|  6|
|  7|
|  8|
|  9|
+---+
```

Closes #34722 from sarutak/java17-home-kube.

Authored-by: Kousuke Saruta <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
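The `[FATAL tini (15)] exec /bin/java failed` line above is plain shell expansion at work: with `JAVA_HOME` unset, `${JAVA_HOME}/bin/java` collapses to `/bin/java`, which does not exist in the image. A minimal sketch of that effect (an illustration only, not the actual `entrypoint.sh`):

```sh
#!/usr/bin/env bash
# With JAVA_HOME unset (the pre-fix image), the expansion used by
# entrypoint.sh degrades to a path that does not exist in the container.
unset JAVA_HOME
echo "${JAVA_HOME}/bin/java"   # prints: /bin/java

# With the ENV added by this commit, the same expansion resolves to the
# apt-get installed OpenJDK 17 (the doubled slash is harmless on Linux).
export JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64/
echo "${JAVA_HOME}/bin/java"   # prints: /usr/lib/jvm/java-17-openjdk-amd64//bin/java
```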
1 parent e91ef19 · commit a3886ba

File tree

1 file changed: +1 −0 lines changed


resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17

Lines changed: 1 addition & 0 deletions

```diff
@@ -51,6 +51,7 @@ COPY kubernetes/tests /opt/spark/tests
 COPY data /opt/spark/data

 ENV SPARK_HOME /opt/spark
+ENV JAVA_HOME /usr/lib/jvm/java-17-openjdk-amd64/

 WORKDIR /opt/spark/work-dir
 RUN chmod g+w /opt/spark/work-dir
```
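To reproduce the verification locally, one could rebuild the image from this Dockerfile and check the variable inside it. The commands below are a sketch, not part of the commit: they assume the `-r`/`-t`/`-f` options of Spark's `bin/docker-image-tool.sh`, a placeholder registry `docker.io/myrepo` and tag `java17-test`, and the tool's usual `<repo>/spark:<tag>` image naming; adjust paths and names to your own layout.

```sh
# Build a container image from the Java 17 Dockerfile (path as in this
# commit; in a binary distribution the file lives under
# kubernetes/dockerfiles/spark/ instead).
./bin/docker-image-tool.sh \
  -r docker.io/myrepo \
  -t java17-test \
  -f resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 \
  build

# Check that JAVA_HOME now resolves inside the image; expected output is
# /usr/lib/jvm/java-17-openjdk-amd64/
docker run --rm --entrypoint sh docker.io/myrepo/spark:java17-test -c 'echo "$JAVA_HOME"'
```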
