Commit 3c941dc (1 parent: c3b8dc8)

rename driver-py and executor-py to spark-driver-py and spark-executor-py

File tree

1 file changed: +7 −7 lines

src/jekyll/running-on-kubernetes.md

Lines changed: 7 additions & 7 deletions
```diff
@@ -55,11 +55,11 @@ If you wish to use pre-built docker images, you may use the images published in
 </tr>
 <tr>
 <td>PySpark Driver Image</td>
-<td><code>kubespark/driver-py:v2.2.0-kubernetes-0.4.0</code></td>
+<td><code>kubespark/spark-driver-py:v2.2.0-kubernetes-0.4.0</code></td>
 </tr>
 <tr>
 <td>PySpark Executor Image</td>
-<td><code>kubespark/executor-py:v2.2.0-kubernetes-0.4.0</code></td>
+<td><code>kubespark/spark-executor-py:v2.2.0-kubernetes-0.4.0</code></td>
 </tr>
 </table>

@@ -142,8 +142,8 @@ Here is how you would execute a Spark-Pi example:
   --kubernetes-namespace <k8s-namespace> \
   --conf spark.executor.instances=5 \
   --conf spark.app.name=spark-pi \
-  --conf spark.kubernetes.driver.docker.image=kubespark/driver-py:v2.2.0-kubernetes-0.4.0 \
-  --conf spark.kubernetes.executor.docker.image=kubespark/executor-py:v2.2.0-kubernetes-0.4.0 \
+  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver-py:v2.2.0-kubernetes-0.4.0 \
+  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor-py:v2.2.0-kubernetes-0.4.0 \
   --jars local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.4.0.jar \
   local:///opt/spark/examples/src/main/python/pi.py 10

@@ -156,15 +156,15 @@ We support this as well, as seen with the following example:
   --kubernetes-namespace <k8s-namespace> \
   --conf spark.executor.instances=5 \
   --conf spark.app.name=spark-pi \
-  --conf spark.kubernetes.driver.docker.image=kubespark/driver-py:v2.2.0-kubernetes-0.4.0 \
-  --conf spark.kubernetes.executor.docker.image=kubespark/executor-py:v2.2.0-kubernetes-0.4.0 \
+  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver-py:v2.2.0-kubernetes-0.4.0 \
+  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor-py:v2.2.0-kubernetes-0.4.0 \
   --jars local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.4.0.jar \
   --py-files local:///opt/spark/examples/src/main/python/sort.py \
   local:///opt/spark/examples/src/main/python/pi.py 10


 You may also customize your Docker images to use different `pip` packages that suit your use-case. As you can see
-with the current `driver-py` Docker image we have commented out the current pip module support that you can uncomment
+with the current `spark-driver-py` Docker image we have commented out the current pip module support that you can uncomment
 to use:

 ...
```
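Since this commit only prefixes the image repository names (`driver-py` → `spark-driver-py`, `executor-py` → `spark-executor-py`), existing submit scripts that reference the old names can be migrated mechanically. Below is a minimal sketch using `sed`; the script name `submit-spark-pi.sh` and its contents are hypothetical stand-ins for whatever scripts reference the old images.

```shell
# Hypothetical submit script fragment referencing the pre-rename image names.
printf '%s\n' \
  '  --conf spark.kubernetes.driver.docker.image=kubespark/driver-py:v2.2.0-kubernetes-0.4.0 \' \
  '  --conf spark.kubernetes.executor.docker.image=kubespark/executor-py:v2.2.0-kubernetes-0.4.0 \' \
  > submit-spark-pi.sh

# Rewrite old image references in place to the renamed repositories.
# (GNU sed -i; on BSD/macOS sed use -i '' instead.)
sed -i \
  -e 's#kubespark/driver-py:#kubespark/spark-driver-py:#g' \
  -e 's#kubespark/executor-py:#kubespark/spark-executor-py:#g' \
  submit-spark-pi.sh

cat submit-spark-pi.sh
```

Anchoring each pattern on the `kubespark/` prefix and the trailing `:` keeps the substitution from touching unrelated text, and running it a second time is a no-op since `kubespark/spark-driver-py:` no longer matches `kubespark/driver-py:`.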

0 commit comments