
Commit aeb2ecc

ConeyLiu authored and Marcelo Vanzin committed
[SPARK-20621][DEPLOY] Delete deprecated config parameter in 'spark-env.sh'
## What changes were proposed in this pull request?

Currently, `spark.executor.instances` is deprecated in `spark-env.sh`, since the suggestion is to configure it in `spark-defaults.conf` or another config file. The parameter also has no effect when set in `spark-env.sh`, so this patch removes it.

## How was this patch tested?

Existing tests.

Please review http://spark.apache.org/contributing.html before opening a pull request.

Author: Xianyang Liu <[email protected]>

Closes #17881 from ConeyLiu/deprecatedParam.
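With the env var route removed, the executor count is meant to be set through `spark.executor.instances`. A minimal, hedged sketch of the programmatic equivalent; the app name and the value "4" are illustrative assumptions, not taken from the patch:

```scala
// Hedged sketch: configure the executor count via spark.executor.instances
// (spark-defaults.conf, --conf on spark-submit, or programmatically as below)
// rather than the removed SPARK_EXECUTOR_INSTANCES environment variable.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("executor-instances-example") // hypothetical app name
  .set("spark.executor.instances", "4")     // same key used by spark-defaults.conf
```

On YARN the same property can also be set with `spark-submit --num-executors 4` or a `spark.executor.instances` line in `spark-defaults.conf`.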
1 parent 58518d0


2 files changed: +1 -5 lines


conf/spark-env.sh.template

Lines changed: 0 additions & 1 deletion
@@ -34,7 +34,6 @@
 
 # Options read in YARN client mode
 # - HADOOP_CONF_DIR, to point Spark towards Hadoop configuration files
-# - SPARK_EXECUTOR_INSTANCES, Number of executors to start (Default: 2)
 # - SPARK_EXECUTOR_CORES, Number of cores for the executors (Default: 1).
 # - SPARK_EXECUTOR_MEMORY, Memory per Executor (e.g. 1000M, 2G) (Default: 1G)
 # - SPARK_DRIVER_MEMORY, Memory for Driver (e.g. 1000M, 2G) (Default: 1G)

resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala

Lines changed: 1 addition & 4 deletions
@@ -280,10 +280,7 @@ object YarnSparkHadoopUtil {
 
       initialNumExecutors
     } else {
-      val targetNumExecutors =
-        sys.env.get("SPARK_EXECUTOR_INSTANCES").map(_.toInt).getOrElse(numExecutors)
-      // System property can override environment variable.
-      conf.get(EXECUTOR_INSTANCES).getOrElse(targetNumExecutors)
+      conf.get(EXECUTOR_INSTANCES).getOrElse(numExecutors)
     }
   }
 }
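For reference, a simplified, hedged sketch of the resolution order after this change: when dynamic allocation is enabled the initial executor count still comes from the dynamic-allocation settings, and otherwise only `spark.executor.instances` (falling back to the method's default) is consulted; the environment variable is no longer read. The method below is illustrative, not the actual `YarnSparkHadoopUtil.getInitialTargetExecutorNumber`, and it uses raw config keys instead of the internal `EXECUTOR_INSTANCES` / dynamic-allocation `ConfigEntry` objects.

```scala
import org.apache.spark.SparkConf

// Hedged sketch of the post-patch behavior; key names and the default of 2
// mirror the surrounding code, but the resolution of the dynamic-allocation
// initial count is simplified here.
def initialTargetExecutors(conf: SparkConf, numExecutors: Int = 2): Int = {
  if (conf.getBoolean("spark.dynamicAllocation.enabled", false)) {
    // Dynamic allocation: start from the configured initial executor count,
    // falling back to the configured minimum.
    conf.getInt("spark.dynamicAllocation.initialExecutors",
      conf.getInt("spark.dynamicAllocation.minExecutors", 0))
  } else {
    // Static allocation: spark.executor.instances, else the caller-supplied default.
    // SPARK_EXECUTOR_INSTANCES is intentionally ignored after this patch.
    conf.getInt("spark.executor.instances", numExecutors)
  }
}
```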

0 commit comments
