docs/configuration.md (13 additions, 1 deletion)
@@ -116,7 +116,7 @@ of the most common options to set are:
     Amount of memory to use for the driver process, i.e. where SparkContext is initialized
     (e.g. <code>512m</code>, <code>2g</code>).
-    <br /><br />Note: this setting only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode (e.g. spark-shell), this setting has no effect. In client mode, driver memory should be configured in the run-time settings, i.e. <code>--driver-memory 2g</code>.
+    <br /><br />Note: setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode, driver memory should be configured in the run-time settings, i.e. <code>--driver-memory 2g</code>, or within <code>conf/spark-defaults.conf</code>.
   </td>
 </tr>
 <tr>
@@ -137,6 +137,9 @@ of the most common options to set are:
     Having a high limit may cause out-of-memory errors in driver (depends on spark.driver.memory
     and memory overhead of objects in JVM). Setting a proper limit can protect the driver from
     out-of-memory errors.
+
+    <br /><br />Note: setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode, this should be configured in <code>conf/spark-defaults.conf</code>.
+
   </td>
 </tr>
 <tr>
@@ -216,20 +219,27 @@ Apart from these, the following properties are also available, and may be useful
   <td>(none)</td>
   <td>
     A string of extra JVM options to pass to the driver. For instance, GC settings or other logging.
+
+    <br /><br />Note: setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode, this should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (see Dynamically Loading Spark Properties).
+
   </td>
 </tr>
 <tr>
   <td><code>spark.driver.extraClassPath</code></td>
   <td>(none)</td>
   <td>
     Extra classpath entries to append to the classpath of the driver.
+
+    <br /><br />Note: setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode, this should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (see Dynamically Loading Spark Properties).
   </td>
 </tr>
 <tr>
   <td><code>spark.driver.extraLibraryPath</code></td>
   <td>(none)</td>
   <td>
     Set a special library path to use when launching the driver JVM.
+
+    <br /><br />Note: setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode, this should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (see Dynamically Loading Spark Properties).
   </td>
 </tr>
 <tr>
@@ -239,6 +249,8 @@ Apart from these, the following properties are also available, and may be useful
     (Experimental) Whether to give user-added jars precedence over Spark's own jars when loading
     classes in the driver. This feature can be used to mitigate conflicts between Spark's
     dependencies and user dependencies. It is currently an experimental feature.
+
+    <br /><br />Note: setting this with <code>conf.set(...)</code> only works in <code>cluster</code> mode (e.g. YARN deployment). In <code>client</code> mode, this should be configured in <code>conf/spark-defaults.conf</code> or via the run-time settings (see Dynamically Loading Spark Properties).
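As a sketch of the client-mode alternative these notes describe, the same driver properties can be placed in <code>conf/spark-defaults.conf</code>, which is read before the driver JVM starts, so the values take effect even in client mode. The property names below appear in the diff above; the values are illustrative assumptions, not recommendations:

```
# conf/spark-defaults.conf -- loaded before the driver JVM launches,
# so these apply in client mode, unlike conf.set(...) in application code.
# Values are examples only.
spark.driver.memory              2g
spark.driver.maxResultSize       1g
spark.driver.extraJavaOptions    -verbose:gc -XX:+PrintGCDetails
spark.driver.extraClassPath      /path/to/extra.jar
```

Equivalently, the "run-time settings" route passes these at launch, e.g. <code>spark-submit --driver-memory 2g --conf "spark.driver.extraJavaOptions=-verbose:gc" ...</code>.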