@@ -31,16 +31,16 @@ For example, if the registry host is `registry-host` and the registry is listening
 Kubernetes applications can be executed via `spark-submit`. For example, to compute the value of pi, assuming the images
 are set up as described above:
 
-    bin/spark-submit
-      --deploy-mode cluster
-      --class org.apache.spark.examples.SparkPi
-      --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>
-      --kubernetes-namespace default
-      --conf spark.executor.instances=5
-      --conf spark.app.name=spark-pi
-      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver:latest
-      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest
-      examples/jars/spark_2.11-2.2.0.jar
+    bin/spark-submit \
+      --deploy-mode cluster \
+      --class org.apache.spark.examples.SparkPi \
+      --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
+      --kubernetes-namespace default \
+      --conf spark.executor.instances=5 \
+      --conf spark.app.name=spark-pi \
+      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver:latest \
+      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest \
+      examples/jars/spark_examples_2.11-2.2.0.jar
 
 <!-- TODO master should default to https if no scheme is specified -->
 The Spark master, specified either via passing the `--master` command line argument to `spark-submit` or by setting
@@ -75,53 +75,53 @@ examples of providing application dependencies.
 
 To submit an application with both the main resource and two other jars living on the submitting user's machine:
 
-    bin/spark-submit
-      --deploy-mode cluster
-      --class com.example.applications.SampleApplication
-      --master k8s://https://192.168.99.100
-      --kubernetes-namespace default
-      --upload-jars /home/exampleuser/exampleapplication/dep1.jar,/home/exampleuser/exampleapplication/dep2.jar
-      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver:latest
-      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest
+    bin/spark-submit \
+      --deploy-mode cluster \
+      --class com.example.applications.SampleApplication \
+      --master k8s://https://192.168.99.100 \
+      --kubernetes-namespace default \
+      --upload-jars /home/exampleuser/exampleapplication/dep1.jar,/home/exampleuser/exampleapplication/dep2.jar \
+      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver:latest \
+      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest \
       /home/exampleuser/exampleapplication/main.jar
 
 Note that since passing the jars through the `--upload-jars` command line argument is equivalent to setting the
 `spark.kubernetes.driver.uploads.jars` Spark property, the above will behave identically to this command:
 
-    bin/spark-submit
-      --deploy-mode cluster
-      --class com.example.applications.SampleApplication
-      --master k8s://https://192.168.99.100
-      --kubernetes-namespace default
-      --conf spark.kubernetes.driver.uploads.jars=/home/exampleuser/exampleapplication/dep1.jar,/home/exampleuser/exampleapplication/dep2.jar
-      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver:latest
-      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest
+    bin/spark-submit \
+      --deploy-mode cluster \
+      --class com.example.applications.SampleApplication \
+      --master k8s://https://192.168.99.100 \
+      --kubernetes-namespace default \
+      --conf spark.kubernetes.driver.uploads.jars=/home/exampleuser/exampleapplication/dep1.jar,/home/exampleuser/exampleapplication/dep2.jar \
+      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver:latest \
+      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest \
       /home/exampleuser/exampleapplication/main.jar
 
 To specify a main application resource that can be downloaded from an HTTP service, and if a plugin for that application
 is located in the jar `/opt/spark-plugins/app-plugin.jar` on the docker image's disk:
 
-    bin/spark-submit
-      --deploy-mode cluster
-      --class com.example.applications.PluggableApplication
-      --master k8s://https://192.168.99.100
-      --kubernetes-namespace default
-      --jars /opt/spark-plugins/app-plugin.jar
-      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver-custom:latest
-      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest
+    bin/spark-submit \
+      --deploy-mode cluster \
+      --class com.example.applications.PluggableApplication \
+      --master k8s://https://192.168.99.100 \
+      --kubernetes-namespace default \
+      --jars /opt/spark-plugins/app-plugin.jar \
+      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver-custom:latest \
+      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest \
       http://example.com:8080/applications/sparkpluggable/app.jar
 
 Note that since passing the jars through the `--jars` command line argument is equivalent to setting the `spark.jars`
 Spark property, the above will behave identically to this command:
 
-    bin/spark-submit
-      --deploy-mode cluster
-      --class com.example.applications.PluggableApplication
-      --master k8s://https://192.168.99.100
-      --kubernetes-namespace default
-      --conf spark.jars=file:///opt/spark-plugins/app-plugin.jar
-      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver-custom:latest
-      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest
+    bin/spark-submit \
+      --deploy-mode cluster \
+      --class com.example.applications.PluggableApplication \
+      --master k8s://https://192.168.99.100 \
+      --kubernetes-namespace default \
+      --conf spark.jars=file:///opt/spark-plugins/app-plugin.jar \
+      --conf spark.kubernetes.driver.docker.image=registry-host:5000/spark-driver-custom:latest \
+      --conf spark.kubernetes.executor.docker.image=registry-host:5000/spark-executor:latest \
       http://example.com:8080/applications/sparkpluggable/app.jar
 
 ### Spark Properties
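
Every command in these examples passes the master as `k8s://` followed by the full `https` apiserver URL. A minimal shell sketch of assembling that URL from variables, so it can be reused across submissions; the host and port values here are placeholders, not a real endpoint:

```shell
# Hypothetical apiserver coordinates; substitute your cluster's values.
K8S_APISERVER_HOST="192.168.99.100"
K8S_APISERVER_PORT="8443"

# spark-submit expects the k8s:// prefix followed by the full https URL.
MASTER="k8s://https://${K8S_APISERVER_HOST}:${K8S_APISERVER_PORT}"
echo "$MASTER"
```

With this in place, `--master "$MASTER"` can stand in for the literal URL in the commands above.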