
Commit 6bddc40

Ilya Ganelin authored and Andrew Or committed
SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory", ...)` will not work
I've updated documentation to reflect true behavior of this setting in client vs. cluster mode.

Author: Ilya Ganelin <[email protected]>

Closes apache#4665 from ilganeli/SPARK-5570 and squashes the following commits:

- 5d1c8dd [Ilya Ganelin] Added example configuration code
- a51700a [Ilya Ganelin] Getting rid of extra spaces
- 85f7a08 [Ilya Ganelin] Reworded note
- 5889d43 [Ilya Ganelin] Formatting adjustment
- f149ba1 [Ilya Ganelin] Minor updates
- 1fec7a5 [Ilya Ganelin] Updated to add clarification for other driver properties
- db47595 [Ilya Ganelin] Slight formatting update
- c899564 [Ilya Ganelin] Merge remote-tracking branch 'upstream/master' into SPARK-5570
- 17b751d [Ilya Ganelin] Updated documentation for driver-memory to reflect its true behavior in client vs cluster mode
1 parent 34b7c35 commit 6bddc40

File tree: 1 file changed (+22, -1 lines)


docs/configuration.md

Lines changed: 22 additions & 1 deletion
@@ -115,7 +115,11 @@ of the most common options to set are:
   <td>
     Amount of memory to use for the driver process, i.e. where SparkContext is initialized.
     (e.g. <code>512m</code>, <code>2g</code>).
-  </td>
+
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the <code>--driver-memory</code> command line option
+    or in your default properties file.</td>
 </tr>
 <tr>
   <td><code>spark.executor.memory</code></td>
@@ -214,20 +218,35 @@ Apart from these, the following properties are also available, and may be useful
   <td>(none)</td>
   <td>
     A string of extra JVM options to pass to the driver. For instance, GC settings or other logging.
+
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the <code>--driver-java-options</code> command line option or in
+    your default properties file.</td>
   </td>
 </tr>
 <tr>
   <td><code>spark.driver.extraClassPath</code></td>
   <td>(none)</td>
   <td>
     Extra classpath entries to append to the classpath of the driver.
+
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the <code>--driver-class-path</code> command line option or in
+    your default properties file.</td>
   </td>
 </tr>
 <tr>
   <td><code>spark.driver.extraLibraryPath</code></td>
   <td>(none)</td>
   <td>
     Set a special library path to use when launching the driver JVM.
+
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the <code>--driver-library-path</code> command line option or in
+    your default properties file.</td>
   </td>
 </tr>
 <tr>
@@ -237,6 +256,8 @@ Apart from these, the following properties are also available, and may be useful
     (Experimental) Whether to give user-added jars precedence over Spark's own jars when loading
     classes in the driver. This feature can be used to mitigate conflicts between Spark's
     dependencies and user dependencies. It is currently an experimental feature.
+
+    This is used in cluster mode only.
   </td>
 </tr>
 <tr>
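The common thread in these notes: all four `spark.driver.*` settings are consumed by the launcher before the driver JVM exists, so in client mode they must be supplied outside the application code. A minimal sketch of the properties-file route (the values and paths below are illustrative, not part of the commit):

```
# conf/spark-defaults.conf -- read by spark-submit before the driver JVM
# starts, so these driver settings take effect even in client mode.
# (Values and paths are hypothetical examples.)
spark.driver.memory              4g
spark.driver.extraJavaOptions    -XX:+PrintGCDetails
spark.driver.extraClassPath      /opt/spark-extras/extra.jar
spark.driver.extraLibraryPath    /opt/spark-extras/native
```

Equivalently, the same settings can be passed on the command line, e.g. `spark-submit --driver-memory 4g --driver-class-path /opt/spark-extras/extra.jar ...`, whereas `new SparkConf().set("spark.driver.memory", "4g")` inside the application arrives too late in client mode because the driver JVM is already running.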
