
Commit b4b496c

spark-defaults.properties -> spark-defaults.conf
1 parent: 0086939


5 files changed: +6 −6 lines changed


.rat-excludes

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ RELEASE
 control
 docs
 fairscheduler.xml.template
-spark-defaults.properties.template
+spark-defaults.conf.template
 log4j.properties
 log4j.properties.template
 metrics.properties.template

core/src/main/scala/org/apache/spark/SparkConf.scala

Lines changed: 1 addition & 1 deletion
@@ -244,7 +244,7 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
       |This has undefined behavior when running on a cluster and is deprecated in Spark 1.0+.
       |
       |Please instead use:
-      | - ./spark-submit with conf/spark-defaults.properties to set defaults for an application
+      | - ./spark-submit with conf/spark-defaults.conf to set defaults for an application
       | - ./spark-submit with --driver-java-options to set -X options for a driver
       | - spark.executor.extraJavaOptions to set -X options for executors
       | - SPARK_DAEMON_OPTS to set java options for standalone daemons (i.e. master, worker)
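The deprecated mechanism this warning targets is, judging from the alternatives it lists, the SPARK_JAVA_OPTS environment variable. A hedged before/after sketch of the migration it prescribes (the property and value are illustrative):

    # Deprecated: Spark setting passed as a JVM system property
    SPARK_JAVA_OPTS="-Dspark.executor.memory=2g"   # e.g. in conf/spark-env.sh

    # Replacement: a plain key/value line in conf/spark-defaults.conf,
    # picked up automatically by spark-submit
    spark.executor.memory   2g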

core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala

Lines changed: 2 additions & 2 deletions
@@ -82,7 +82,7 @@ private[spark] class SparkSubmitArguments(args: Array[String]) {
     if (propertiesFile == null) {
       sys.env.get("SPARK_HOME").foreach { sparkHome =>
         val sep = File.separator
-        val defaultPath = s"${sparkHome}${sep}conf${sep}spark-defaults.properties"
+        val defaultPath = s"${sparkHome}${sep}conf${sep}spark-defaults.conf"
         val file = new File(defaultPath)
         if (file.exists()) {
           propertiesFile = file.getAbsolutePath
@@ -283,7 +283,7 @@ private[spark] class SparkSubmitArguments(args: Array[String]) {
     |  --files FILES               Comma separated list of files to be placed in the working dir
     |                              of each executor.
     |  --properties-file FILE      Path to a file from which to load extra properties. If not
-    |                              specified, this will look for conf/spark-defaults.properties.
+    |                              specified, this will look for conf/spark-defaults.conf.
     |
     |  --driver-memory MEM         Memory for driver (e.g. 1000M, 2G) (Default: 512M).
     |  --driver-java-options       Extra Java options to pass to the driver
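The first hunk above resolves the default path to $SPARK_HOME/conf/spark-defaults.conf. A minimal standalone sketch of the same lookup-and-load pattern (not Spark's actual loading code; it relies on the fact that whitespace-separated key/value lines are valid java.util.Properties syntax, so a .conf file in this format can still be read as a properties file):

    import java.io.{File, FileInputStream}
    import java.util.Properties
    import scala.collection.JavaConverters._

    // Sketch: locate $SPARK_HOME/conf/spark-defaults.conf and read it as
    // java.util.Properties, returning an immutable key/value map.
    def loadDefaults(sparkHome: String): Map[String, String] = {
      val sep = File.separator
      val file = new File(s"${sparkHome}${sep}conf${sep}spark-defaults.conf")
      if (!file.exists()) {
        Map.empty
      } else {
        val props = new Properties()
        val in = new FileInputStream(file)
        try props.load(in) finally in.close()
        props.asScala.toMap
      }
    }

As the second hunk's usage text notes, this lookup can be bypassed entirely by passing an explicit --properties-file FILE argument to spark-submit.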

docs/cluster-overview.md

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@ HADOOP_CONF_DIR=XX /bin/spark-submit my-app.jar \
 
 The `spark-submit` script can load default `SparkConf` values from a properties file and pass them
 onto your application. By default it will read configuration options from
-`conf/spark-defaults.properties`. Any values specified in the file will be passed on to the
+`conf/spark-defaults.conf`. Any values specified in the file will be passed on to the
 application when run. They can obviate the need for certain flags to `spark-submit`: for
 instance, if `spark.master` property is set, you can safely omit the
 `--master` flag from `spark-submit`. In general, configuration values explicitly set on a
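To make the renamed doc paragraph concrete, a minimal conf/spark-defaults.conf might contain (hypothetical values; each line is a key and a value separated by whitespace):

    spark.master            spark://master.example.com:7077
    spark.executor.memory   2g

With spark.master set this way, the --master flag can be dropped from the spark-submit command line, exactly as the paragraph describes.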

docs/configuration.md

Lines changed: 1 addition & 1 deletion
@@ -659,7 +659,7 @@ Apart from these, the following properties are also available, and may be useful
     A string of extra JVM options to pass to executors. For instance, GC settings or other
     logging. Note that it is illegal to set Spark properties or heap size settings with this
     option. Spark properties should be set using a SparkConf object or the
-    spark-defaults.properties file used with the spark-submit script. Heap size settings can be set
+    spark-defaults.conf file used with the spark-submit script. Heap size settings can be set
     with spark.executor.memory.
   </td>
 </tr>
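To illustrate the distinction this property description draws, a minimal Scala sketch (the app name and option values are hypothetical):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("MyApp")
      // JVM flags such as GC logging belong in extraJavaOptions...
      .set("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails -XX:+PrintGCTimeStamps")
      // ...but heap size must go through spark.executor.memory, not -Xmx
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)

The same two properties could equally be written as lines in spark-defaults.conf and passed in by spark-submit.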
