
Commit 59f475c

Merge pull request apache#442 from pwendell/standalone

Workers should use the working directory as Spark home if it's not specified. If users don't set SPARK_HOME in their environment file when launching an application, the standalone cluster should default to the Spark home of the worker.
2 parents (2a05403 + 00a3f7e), commit 59f475c

File tree

1 file changed: +4 −1 lines changed

  • core/src/main/scala/org/apache/spark/deploy/worker


core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala

Lines changed: 4 additions & 1 deletion
@@ -209,8 +209,11 @@ private[spark] class Worker(
       logWarning("Invalid Master (" + masterUrl + ") attempted to launch executor.")
     } else {
       logInfo("Asked to launch executor %s/%d for %s".format(appId, execId, appDesc.name))
+      // TODO (pwendell): We should make sparkHome an Option[String] in
+      // ApplicationDescription to be more explicit about this.
+      val effectiveSparkHome = Option(execSparkHome_).getOrElse(sparkHome.getAbsolutePath)
       val manager = new ExecutorRunner(appId, execId, appDesc, cores_, memory_,
-        self, workerId, host, new File(execSparkHome_), workDir, akkaUrl, ExecutorState.RUNNING)
+        self, workerId, host, new File(effectiveSparkHome), workDir, akkaUrl, ExecutorState.RUNNING)
       executors(appId + "/" + execId) = manager
       manager.start()
       coresUsed += cores_
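
The change hinges on Scala's Option(...).getOrElse fallback: the possibly-unset Spark home supplied by the application is wrapped in Option, and the worker's own Spark home is used when it is absent. Below is a minimal, self-contained sketch of that pattern; requestedSparkHome and workerSparkHome are hypothetical names standing in for execSparkHome_ and sparkHome, not part of the commit.

import java.io.File

// Sketch of the fallback pattern from the diff above, with hypothetical names.
object SparkHomeFallbackSketch {
  def main(args: Array[String]): Unit = {
    val workerSparkHome = new File("/opt/spark")   // the worker's own Spark home
    val requestedSparkHome: String = null          // the application did not set SPARK_HOME

    // Option(null) yields None, so getOrElse falls back to the worker's Spark home.
    val effectiveSparkHome =
      Option(requestedSparkHome).getOrElse(workerSparkHome.getAbsolutePath)

    println(effectiveSparkHome)  // prints /opt/spark
  }
}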
