
Commit 44719e6

Authored by Kanwaljit Singh and committed by Andrew Or
SPARK-2641: Passing num executors to spark arguments from properties file
Since we can set the Spark executor memory and executor cores via the properties file, we should also be able to set the number of executor instances there.

Author: Kanwaljit Singh <[email protected]>

Closes #1657 from kjsingh/branch-1.0 and squashes the following commits:

d8a5a12 [Kanwaljit Singh] SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors
1 parent e0fc0c5 commit 44719e6

File tree: 1 file changed, 2 insertions(+), 0 deletions(-)

core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala

Lines changed: 2 additions & 0 deletions
@@ -105,6 +105,8 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
       .getOrElse(defaultProperties.get("spark.cores.max").orNull)
     name = Option(name).getOrElse(defaultProperties.get("spark.app.name").orNull)
     jars = Option(jars).getOrElse(defaultProperties.get("spark.jars").orNull)
+    numExecutors = Option(numExecutors)
+      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)

     // This supports env vars in older versions of Spark
     master = Option(master).getOrElse(System.getenv("MASTER"))
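The patch applies the same fallback pattern already used for the other arguments: a value supplied on the command line wins, otherwise the value from the properties file is used, otherwise the field stays null. A minimal standalone sketch of that pattern (here `defaultProperties` is a hypothetical in-memory map standing in for the parsed spark-defaults file, and `resolve` is an illustrative helper, not part of SparkSubmitArguments):

```scala
// Sketch of the CLI-first, properties-file-second fallback pattern.
// `defaultProperties` is a stand-in for the parsed properties file.
object FallbackSketch {
  val defaultProperties: Map[String, String] =
    Map("spark.executor.instances" -> "4")

  // Mirrors: numExecutors = Option(numExecutors)
  //            .getOrElse(defaultProperties.get("spark.executor.instances").orNull)
  // Option(null) is None, so a null CLI value falls through to the file;
  // a missing key yields None.orNull, i.e. the field stays null.
  def resolve(cliValue: String, key: String): String =
    Option(cliValue).getOrElse(defaultProperties.get(key).orNull)

  def main(args: Array[String]): Unit = {
    assert(resolve(null, "spark.executor.instances") == "4") // falls back to file
    assert(resolve("8", "spark.executor.instances") == "8")  // CLI value wins
    assert(resolve(null, "spark.no.such.key") == null)       // neither set
    println("ok")
  }
}
```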

0 commit comments
