
Commit a3978f3

marmbrus authored and pwendell committed
[SPARK-5078] Optionally read from SPARK_LOCAL_HOSTNAME

Currently Spark lets you set the IP address using SPARK_LOCAL_IP, but this is handed to Akka only after a reverse DNS lookup. This makes it difficult to run Spark in Docker. You can already change the hostname that is used programmatically, but it would be nice to be able to do this with an environment variable as well.

Author: Michael Armbrust <[email protected]>

Closes apache#3893 from marmbrus/localHostnameEnv and squashes the following commits:

85045b6 [Michael Armbrust] Optionally read from SPARK_LOCAL_HOSTNAME
1 parent: 13e610b · commit: a3978f3

File tree: 1 file changed (+1, -1 lines)


core/src/main/scala/org/apache/spark/util/Utils.scala

Lines changed: 1 addition & 1 deletion
@@ -701,7 +701,7 @@ private[spark] object Utils extends Logging {
     }
   }
 
-  private var customHostname: Option[String] = None
+  private var customHostname: Option[String] = sys.env.get("SPARK_LOCAL_HOSTNAME")
 
   /**
    * Allow setting a custom host name because when we run on Mesos we need to use the same
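The one-line change above seeds the hostname override from the environment while leaving the programmatic setter intact. A minimal self-contained sketch of that pattern (the object and method names here mirror Spark's `Utils.setCustomHostname`/`localHostName`, but this is an illustration, not the actual Utils.scala):

```scala
// Sketch of an env-var-seeded, programmatically overridable hostname,
// assuming the names below; only sys.env.get("SPARK_LOCAL_HOSTNAME")
// matches the patch itself.
object LocalHostname {
  // Initialized from SPARK_LOCAL_HOSTNAME if the variable is set,
  // otherwise None (the pre-patch behavior).
  private var customHostname: Option[String] =
    sys.env.get("SPARK_LOCAL_HOSTNAME")

  // Programmatic override, as was already possible before the patch.
  def setCustomHostname(hostname: String): Unit = {
    customHostname = Some(hostname)
  }

  // Fall back to the JVM's view of the local hostname (which may involve
  // a reverse DNS lookup) only when no override exists.
  def localHostName(): String =
    customHostname.getOrElse(java.net.InetAddress.getLocalHost.getHostName)
}
```

Because the environment variable only supplies the initial value of `customHostname`, a later programmatic call still wins, so existing callers are unaffected; in a Docker container one can simply export `SPARK_LOCAL_HOSTNAME` and skip the reverse DNS lookup entirely.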

0 commit comments