Commit 59f475c
Merge pull request apache#442 from pwendell/standalone
Workers should use the working directory as Spark home if it's not specified
If users don't set SPARK_HOME in their environment file when launching an application, the standalone cluster should default to the Spark home of the worker.
1 file changed (+4, −1): core/src/main/scala/org/apache/spark/deploy/worker
Diff (body not preserved): three lines added after line 211 and original line 213 replaced, for a net of +4, −1.
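Since only the hunk's line numbers survive above, here is a minimal sketch of the fallback the description implies. The object and method names (`SparkHomeFallback`, `resolveExecutorSparkHome`, `workerDir`) are illustrative assumptions, not taken from the actual patch.

```scala
import java.io.File

// Sketch of the fallback this commit describes; names and structure are
// illustrative assumptions, not the actual worker deploy code.
object SparkHomeFallback {

  // If the application's environment supplied a SPARK_HOME, use it; otherwise
  // default to the worker's own directory (here, its working directory).
  def resolveExecutorSparkHome(appSparkHome: Option[String], workerDir: String): File =
    new File(appSparkHome.filter(_.nonEmpty).getOrElse(workerDir))

  def main(args: Array[String]): Unit = {
    val workerDir = System.getProperty("user.dir") // the worker's working directory

    // Application launched without SPARK_HOME falls back to the worker's directory.
    println(resolveExecutorSparkHome(None, workerDir))

    // Application launched with an explicit SPARK_HOME keeps it.
    println(resolveExecutorSparkHome(Some("/opt/spark"), workerDir))
  }
}
```

Modeling the application-supplied home as an `Option[String]` makes the default explicit: an unset or empty SPARK_HOME falls through to the worker's own directory.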