[SPARK-4563][core] Allow driver to advertise a different network address. #15120
Changes from all commits
```diff
@@ -19,6 +19,7 @@ package org.apache.spark.internal

 import org.apache.spark.launcher.SparkLauncher
 import org.apache.spark.network.util.ByteUnit
+import org.apache.spark.util.Utils

 package object config {

@@ -143,4 +144,23 @@ package object config {
     .internal()
     .stringConf
     .createWithDefaultString("AES/CTR/NoPadding")
+
+  private[spark] val DRIVER_HOST_ADDRESS = ConfigBuilder("spark.driver.host")
+    .doc("Address of driver endpoints.")
+    .stringConf
+    .createWithDefault(Utils.localHostName())
```
Member: This is a breaking change. If a user uses

Contributor (Author): It should, as long as he doesn't set "spark.driver.bindAddress" - which is a new setting that is being added to override that behavior.

Member: Maybe I'm missing something.

Contributor (Author): Ah, I see what you mean. I might have inverted the config resolution order... let me take a look.
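To make the fallback being discussed concrete, here is a minimal illustrative sketch of the intended resolution order (this is not Spark's ConfigBuilder machinery; the helper name and the plain Map-based config are assumptions for illustration): spark.driver.bindAddress falls back to spark.driver.host, which in turn defaults to the local host name.

```scala
import java.net.InetAddress

// Hypothetical helper, for illustration only: mirrors the fallback chain
// spark.driver.bindAddress -> spark.driver.host -> local host name.
def resolveDriverAddresses(conf: Map[String, String]): (String, String) = {
  // Advertised address: what executors and the cluster manager connect back to.
  val advertised = conf.getOrElse("spark.driver.host", InetAddress.getLocalHost.getHostName)
  // Bind address: what the driver's listen sockets actually bind to; when unset it
  // falls back to the advertised address, preserving the old spark.driver.host behavior.
  val bind = conf.getOrElse("spark.driver.bindAddress", advertised)
  (bind, advertised)
}
```

Read this way, a user who only sets spark.driver.host keeps the previous behavior (bind and advertise the same address), which is the point made in the reply above.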
```diff
+
+  private[spark] val DRIVER_BIND_ADDRESS = ConfigBuilder("spark.driver.bindAddress")
+    .doc("Address where to bind network listen sockets on the driver.")
+    .fallbackConf(DRIVER_HOST_ADDRESS)
+
+  private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
+    .doc("Port to use for the block manager when a more specific setting is not provided.")
+    .intConf
+    .createWithDefault(0)
+
+  private[spark] val DRIVER_BLOCK_MANAGER_PORT = ConfigBuilder("spark.driver.blockManager.port")
+    .doc("Port to use for the block manager on the driver.")
+    .fallbackConf(BLOCK_MANAGER_PORT)
 }
```
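As a usage sketch (the host names, ports, and the NAT/container scenario here are invented for illustration), these settings let a driver bind its listen sockets to a local interface while advertising a routable address to executors:

```scala
import org.apache.spark.SparkConf

// Hypothetical values: the driver binds to all local interfaces but advertises
// the externally reachable host name to executors and the cluster manager.
val conf = new SparkConf()
  .setAppName("bind-address-example")
  .set("spark.driver.bindAddress", "0.0.0.0")       // where the driver's sockets bind
  .set("spark.driver.host", "driver.example.com")   // what executors connect back to
  .set("spark.driver.blockManager.port", "7079")    // otherwise falls back to spark.blockManager.port
```

The same keys can also be passed on the command line with `--conf`, e.g. `--conf spark.driver.bindAddress=0.0.0.0`.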
```diff
@@ -2079,9 +2079,9 @@ private[spark] object Utils extends Logging {
       case e: Exception if isBindCollision(e) =>
         if (offset >= maxRetries) {
           val exceptionMessage = s"${e.getMessage}: Service$serviceString failed after " +
-            s"$maxRetries retries! Consider explicitly setting the appropriate port for the " +
-            s"service$serviceString (for example spark.ui.port for SparkUI) to an available " +
-            "port or increasing spark.port.maxRetries."
+            s"$maxRetries retries (starting from $startPort)! Consider explicitly setting " +
+            s"the appropriate port for the service$serviceString (for example spark.ui.port " +
```
Member: nit: since you are touching this, could you add a space between

Contributor (Author): The space is actually in

Member: Got it.
```diff
+            s"for SparkUI) to an available port or increasing spark.port.maxRetries."
           val exception = new BindException(exceptionMessage)
           // restore original stack trace
           exception.setStackTrace(e.getStackTrace)
```
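For context, here is a simplified sketch of the retry loop this message belongs to (this is not Spark's actual Utils.startServiceOnPort; the function name and the plain ServerSocket are stand-ins): successive ports are tried starting from a base port, and on giving up the error now reports that starting port.

```scala
import java.io.IOException
import java.net.{BindException, ServerSocket}

// Illustrative only, not Spark's Utils.startServiceOnPort: try successive ports
// starting at startPort and, on giving up, mention that starting port in the error,
// as the updated message above now does.
def startOnAvailablePort(startPort: Int, maxRetries: Int): ServerSocket = {
  def attempt(offset: Int): ServerSocket = {
    val port = startPort + offset
    try {
      new ServerSocket(port)
    } catch {
      case _: IOException if offset < maxRetries =>
        attempt(offset + 1)  // bind collision: try the next port
      case e: IOException =>
        val exception = new BindException(
          s"${e.getMessage}: Service failed after $maxRetries retries " +
            s"(starting from $startPort)! Consider setting an explicit port " +
            "or increasing spark.port.maxRetries.")
        exception.setStackTrace(e.getStackTrace)  // keep the original stack trace
        throw exception
    }
  }
  attempt(0)
}
```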
Do we need to backport this fix to 2.0.1?

No, this code is only in master.

We are facing this issue with Spark 1.6. Are we going to backport this?

What issue? The code you're commenting on does not exist in 1.6. If you're having issues, please ask questions on the mailing lists or use the bug tracker.