Commit 1dc9855

More doc and cleanup

1 parent 00edfb9 commit 1dc9855

File tree

3 files changed: +7 −3 lines changed


core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala

Lines changed: 3 additions & 2 deletions
@@ -55,7 +55,7 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
   var verbose: Boolean = false
   var isPython: Boolean = false
   var pyFiles: String = null
-  var sparkProperties: HashMap[String, String] = new HashMap[String, String]()
+  val sparkProperties: HashMap[String, String] = new HashMap[String, String]()

   parseOpts(args.toList)
   loadDefaults()
@@ -178,6 +178,7 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
   |  executorCores           $executorCores
   |  totalExecutorCores      $totalExecutorCores
   |  propertiesFile          $propertiesFile
+  |  extraSparkProperties    $sparkProperties
   |  driverMemory            $driverMemory
   |  driverCores             $driverCores
   |  driverExtraClassPath    $driverExtraClassPath
@@ -291,7 +292,7 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
       jars = Utils.resolveURIs(value)
       parse(tail)

-      case ("--conf") :: value :: tail =>
+      case ("--conf" | "-c") :: value :: tail =>
         value.split("=", 2).toSeq match {
           case Seq(k, v) => sparkProperties(k) = v
           case _ => SparkSubmit.printErrorAndExit(s"Spark config without '=': $value")
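
The `--conf` handler above depends on `split("=", 2)` splitting only on the first `=`, so values may themselves contain `=` characters (e.g. JVM options). A minimal standalone sketch of that behavior; the `ConfParseSketch` object and `parseConf` helper are illustrative names, not part of the commit:

```scala
// Sketch of the key=value parsing used by the --conf case above.
object ConfParseSketch {
  // Returns Right((key, value)) on "k=v" input, Left(error) otherwise.
  def parseConf(value: String): Either[String, (String, String)] =
    value.split("=", 2).toSeq match {
      case Seq(k, v) => Right((k, v))
      case _         => Left(s"Spark config without '=': $value")
    }

  def main(args: Array[String]): Unit = {
    assert(parseConf("spark.shuffle.spill=false") ==
      Right(("spark.shuffle.spill", "false")))
    // Only the first '=' splits; later ones stay in the value.
    assert(parseConf("spark.executor.extraJavaOptions=-Da=b") ==
      Right(("spark.executor.extraJavaOptions", "-Da=b")))
    // No '=' at all is rejected, mirroring printErrorAndExit above.
    assert(parseConf("noEquals").isLeft)
    println("ok")
  }
}
```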

docs/configuration.md

Lines changed: 2 additions & 1 deletion
@@ -42,7 +42,8 @@ val sc = new SparkContext(new SparkConf())

 Then, you can supply configuration values at runtime:
 {% highlight bash %}
-./bin/spark-submit --name "My app" --master local[4] myApp.jar --conf spark.shuffle.spill=false
+./bin/spark-submit --name "My app" --master local[4] --conf spark.shuffle.spill=false
+  --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -Xmn5g" myApp.jar
 {% endhighlight %}

 The Spark shell and [`spark-submit`](cluster-overview.html#launching-applications-with-spark-submit)

docs/submitting-applications.md

Lines changed: 2 additions & 0 deletions
@@ -33,6 +33,7 @@ dependencies, and can support different cluster managers and deploy modes that S
   --class <main-class>
   --master <master-url> \
   --deploy-mode <deploy-mode> \
+  --conf <key>=<value> \
   ... # other options
   <application-jar> \
   [application-arguments]
@@ -43,6 +44,7 @@ Some of the commonly used options are:
 * `--class`: The entry point for your application (e.g. `org.apache.spark.examples.SparkPi`)
 * `--master`: The [master URL](#master-urls) for the cluster (e.g. `spark://23.195.26.187:7077`)
 * `--deploy-mode`: Whether to deploy your driver on the worker nodes (`cluster`) or locally as an external client (`client`) (default: `client`)*
+* `--conf`: Arbitrary Spark configuration property in key=value format.
 * `application-jar`: Path to a bundled jar including your application and all dependencies. The URL must be globally visible inside of your cluster, for instance, an `hdfs://` path or a `file://` path that is present on all nodes.
 * `application-arguments`: Arguments passed to the main method of your main class, if any
