Commit 956362d

Don't need to pass spark.executor.uri into the spark shell
At least that's how I read SparkILoop#createSparkContext()
1 parent de3353b commit 956362d

1 file changed: 3 additions, 4 deletions


docs/running-on-mesos.md

Lines changed: 3 additions & 4 deletions
@@ -102,7 +102,7 @@ The driver also needs some configuration in `spark-env.sh` to interact properly
 instructions above. On Mac OS X, the library is called `libmesos.dylib` instead of
 `libmesos.so`.
 * `export SPARK_EXECUTOR_URI=<path to spark-{{site.SPARK_VERSION}}.tar.gz uploaded above>`.
-2. Also set `spark.executor.uri` to spark-{{site.SPARK_VERSION}}.tar.gz
+2. Also set `spark.executor.uri` to <path to spark-{{site.SPARK_VERSION}}.tar.gz>
 
 Now when starting a Spark application against the cluster, pass a `mesos://`
 or `zk://` URL as the master when creating a `SparkContext`. For example:
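
The Scala example referred to here sits between the two hunks (the second hunk's context shows its last lines). As a minimal sketch of that pattern, assuming an illustrative master URL, app name, and tarball path rather than the doc's exact text, it might look like:

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: the master URL, app name, and HDFS path are
// illustrative placeholders, not values taken from this commit.
val conf = new SparkConf()
  .setMaster("mesos://host:5050")
  .setAppName("ExampleApp")
  .set("spark.executor.uri", "hdfs:///path/to/spark-{{site.SPARK_VERSION}}.tar.gz")
val sc = new SparkContext(conf)
{% endhighlight %}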
@@ -115,11 +115,10 @@ val conf = new SparkConf()
 val sc = new SparkContext(conf)
 {% endhighlight %}
 
-To set `spark.executor.uri` for use in a Spark shell, set it through the
-`SPARK_JAVA_OPTS` environment variable:
+When running a shell, the `spark.executor.uri` parameter is inherited from `SPARK_EXECUTOR_URI`, so
+it does not need to be redundantly passed in as a system property.
 
 {% highlight shell %}
-export SPARK_JAVA_OPTS="-Dspark.executor.uri=hdfs:///path/to/spark-{{site.SPARK_VERSION}}.tar.gz"
 ./bin/spark-shell --master mesos://host:5050
 {% endhighlight %}
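
The rationale in the commit message is that the shell inherits `spark.executor.uri` from `SPARK_EXECUTOR_URI`. A paraphrased sketch of what that inheritance in `SparkILoop#createSparkContext()` would look like (an assumption about the REPL source, not a verbatim excerpt):

{% highlight scala %}
import org.apache.spark.SparkConf

// Paraphrased sketch: the shell copies SPARK_EXECUTOR_URI from the
// environment into the conf, so passing -Dspark.executor.uri as a
// system property would be redundant.
val conf = new SparkConf()
val execUri = System.getenv("SPARK_EXECUTOR_URI")
if (execUri != null) {
  conf.set("spark.executor.uri", execUri)
}
{% endhighlight %}

If that reading is right, exporting `SPARK_EXECUTOR_URI` in `spark-env.sh`, as the instructions above already require, is enough before running `./bin/spark-shell`.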
