Commit 85f7a08

Author: Ilya Ganelin (committed)
Reworded note
1 parent 5889d43 commit 85f7a08

File tree: 1 file changed, +17 −8 lines


docs/configuration.md (17 additions, 8 deletions)
@@ -116,8 +116,10 @@ of the most common options to set are:
     Amount of memory to use for the driver process, i.e. where SparkContext is initialized.
     (e.g. <code>512m</code>, <code>2g</code>).
 
-    <br /><em>Note:</em> setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). For <code>client</code> driver memory should be configured in the run-time settings ; i.e. --driver-memory 2g or within <code>conf/spark-defaults.conf</code>.
-  </td>
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the <code>--driver-memory</code> command line option
+    or in your default properties file.</td>
   </tr>
   <tr>
     <td><code>spark.executor.memory</code></td>
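As the reworded note explains, in client mode the driver JVM is already running by the time application code calls `conf.set(...)`, so `spark.driver.memory` set through `SparkConf` comes too late. A minimal sketch of the two supported routes (the class name, jar, and value are illustrative, not from this commit):

```shell
# Route 1: pass driver memory on the spark-submit command line (client mode);
# class name and jar below are placeholders for your own application
spark-submit --driver-memory 2g --class com.example.MyApp myapp.jar

# Route 2: set it once in the default properties file, read before the driver JVM starts
echo "spark.driver.memory 2g" >> conf/spark-defaults.conf
```

In cluster mode the driver is launched later by the cluster manager, which is why setting it programmatically can still take effect there.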
@@ -138,8 +140,9 @@ of the most common options to set are:
     and memory overhead of objects in JVM). Setting a proper limit can protect the driver from
     out-of-memory errors.
 
-    <br /><em>Note:</em> setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). For <code>client</code> driver memory should be configured in <code>conf/spark-defaults.conf</code>.
-
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the default properties file.</td>
   </td>
   </tr>
   <tr>
@@ -220,8 +223,10 @@ Apart from these, the following properties are also available, and may be useful
   <td>
     A string of extra JVM options to pass to the driver. For instance, GC settings or other logging.
 
-    <br /><em>Note:</em> setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). For <code>client</code> driver memory should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (See Dynamically Loading Spark Properties).
-
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the command line option (see Dynamically Loading Spark Properties)
+    or in your default properties file.</td>
   </td>
   </tr>
   <tr>
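For `spark.driver.extraJavaOptions`, the command-line route the note refers to goes through `spark-submit`'s `--conf` flag (the mechanism described under Dynamically Loading Spark Properties). A sketch with illustrative GC flags and a placeholder application:

```shell
# Set driver JVM options at submit time rather than via SparkConf (client mode);
# the GC flags, class name, and jar are example values only
spark-submit \
  --conf "spark.driver.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
  --class com.example.MyApp myapp.jar
```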
@@ -230,7 +235,9 @@ Apart from these, the following properties are also available, and may be useful
   <td>
     Extra classpath entries to append to the classpath of the driver.
 
-    <br /><em>Note:</em> setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). For <code>client</code> driver memory should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (See Dynamically Loading Spark Properties).
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the default properties file.</td>
   </td>
   </tr>
   <tr>
@@ -239,7 +246,9 @@ Apart from these, the following properties are also available, and may be useful
   <td>
     Set a special library path to use when launching the driver JVM.
 
-    <br /><em>Note:</em> setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). For <code>client</code> driver memory should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (See Dynamically Loading Spark Properties).
+    <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+    directly in your application, because the driver JVM has already started at that point.
+    Instead, please set this through the default properties file.</td>
   </td>
   </tr>
   <tr>
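The "default properties file" that each reworded note points to is `conf/spark-defaults.conf`, which is read before the driver JVM launches and therefore works in client mode. A sketch of entries for the driver settings touched by this commit (all paths and values are illustrative):

```properties
# conf/spark-defaults.conf -- read before the driver JVM starts,
# so these take effect even in client mode (example values only)
spark.driver.memory              2g
spark.driver.extraJavaOptions    -XX:+PrintGCDetails
spark.driver.extraClassPath      /opt/libs/extra.jar
spark.driver.extraLibraryPath    /opt/native/lib
```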
