Commit c5abb3c

Authored by gaborgsomogyi, committed by Marcelo Vanzin
[SPARK-23476][CORE] Generate secret in local mode when authentication on
## What changes were proposed in this pull request?

If Spark is run with "spark.authenticate=true", it fails to start in local mode. This PR generates the secret in local mode when authentication is on.

## How was this patch tested?

Modified existing unit test. Manually started spark-shell.

Author: Gabor Somogyi <[email protected]>

Closes #20652 from gaborgsomogyi/SPARK-23476.
1 parent 87293c7 commit c5abb3c
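To illustrate what this change enables (not part of the commit): a hedged sketch of a local-mode application that turns authentication on without supplying `spark.authenticate.secret`. The object name and application name are made up for illustration; `spark.authenticate` is the real configuration key.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// With this change, a local-mode app with spark.authenticate=true starts up and
// auto-generates its secret; previously it failed because spark.authenticate.secret
// was required for any non-YARN master.
object LocalAuthExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("local-auth-example")      // hypothetical app name
      .setMaster("local[*]")
      .set("spark.authenticate", "true")     // no spark.authenticate.secret needed in local mode
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).count()) // trivial job to confirm the context works
    sc.stop()
  }
}
```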

File tree

3 files changed: +46 / -22 lines


core/src/main/scala/org/apache/spark/SecurityManager.scala

Lines changed: 11 additions & 5 deletions
@@ -520,19 +520,25 @@ private[spark] class SecurityManager(
    *
    * If authentication is disabled, do nothing.
    *
-   * In YARN mode, generate a new secret and store it in the current user's credentials.
+   * In YARN and local mode, generate a new secret and store it in the current user's credentials.
    *
    * In other modes, assert that the auth secret is set in the configuration.
    */
   def initializeAuth(): Unit = {
+    import SparkMasterRegex._
+
     if (!sparkConf.get(NETWORK_AUTH_ENABLED)) {
       return
     }
 
-    if (sparkConf.get(SparkLauncher.SPARK_MASTER, null) != "yarn") {
-      require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
-        s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config.")
-      return
+    val master = sparkConf.get(SparkLauncher.SPARK_MASTER, "")
+    master match {
+      case "yarn" | "local" | LOCAL_N_REGEX(_) | LOCAL_N_FAILURES_REGEX(_, _) =>
+        // Secret generation allowed here
+      case _ =>
+        require(sparkConf.contains(SPARK_AUTH_SECRET_CONF),
+          s"A secret key must be specified via the $SPARK_AUTH_SECRET_CONF config.")
+        return
     }
 
     val rnd = new SecureRandom()
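For context, the `SparkMasterRegex` patterns imported above classify `local[N]` and `local[N, maxFailures]` master strings. The following self-contained sketch approximates that classification outside the Spark codebase; the regexes are assumptions modeled on Spark's patterns, not copied from them.

```scala
object SecretGenerationCheck {
  // Assumed approximations of Spark's SparkMasterRegex patterns:
  // "local[N]" or "local[*]"
  private val LocalN = """local\[([0-9]+|\*)\]""".r
  // "local[N, maxFailures]" or "local[*, maxFailures]"
  private val LocalNFailures = """local\[([0-9]+|\*)\s*,\s*([0-9]+)\]""".r

  /** True if an auth secret would be auto-generated for this master string. */
  def canGenerateSecret(master: String): Boolean = master match {
    case "yarn" | "local"      => true
    case LocalN(_)             => true
    case LocalNFailures(_, _)  => true
    case _                     => false
  }

  def main(args: Array[String]): Unit = {
    Seq("yarn", "local", "local[*]", "local[1, 2]", "local-cluster[2, 1, 1024]", "invalid")
      .foreach(m => println(s"$m -> ${canGenerateSecret(m)}"))
  }
}
```

Under these assumptions, `local-cluster[...]` and arbitrary master URLs fall through to the default case, which in the real code requires `spark.authenticate.secret` to be set.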

core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala

Lines changed: 34 additions & 16 deletions
@@ -440,23 +440,41 @@ class SecurityManagerSuite extends SparkFunSuite with ResetSystemProperties {
     assert(keyFromEnv === new SecurityManager(conf2).getSecretKey())
   }
 
-  test("secret key generation in yarn mode") {
-    val conf = new SparkConf()
-      .set(NETWORK_AUTH_ENABLED, true)
-      .set(SparkLauncher.SPARK_MASTER, "yarn")
-    val mgr = new SecurityManager(conf)
-
-    UserGroupInformation.createUserForTesting("authTest", Array()).doAs(
-      new PrivilegedExceptionAction[Unit]() {
-        override def run(): Unit = {
-          mgr.initializeAuth()
-          val creds = UserGroupInformation.getCurrentUser().getCredentials()
-          val secret = creds.getSecretKey(SecurityManager.SECRET_LOOKUP_KEY)
-          assert(secret != null)
-          assert(new String(secret, UTF_8) === mgr.getSecretKey())
+  test("secret key generation") {
+    Seq(
+      ("yarn", true),
+      ("local", true),
+      ("local[*]", true),
+      ("local[1, 2]", true),
+      ("local-cluster[2, 1, 1024]", false),
+      ("invalid", false)
+    ).foreach { case (master, shouldGenerateSecret) =>
+      val conf = new SparkConf()
+        .set(NETWORK_AUTH_ENABLED, true)
+        .set(SparkLauncher.SPARK_MASTER, master)
+      val mgr = new SecurityManager(conf)
+
+      UserGroupInformation.createUserForTesting("authTest", Array()).doAs(
+        new PrivilegedExceptionAction[Unit]() {
+          override def run(): Unit = {
+            if (shouldGenerateSecret) {
+              mgr.initializeAuth()
+              val creds = UserGroupInformation.getCurrentUser().getCredentials()
+              val secret = creds.getSecretKey(SecurityManager.SECRET_LOOKUP_KEY)
+              assert(secret != null)
+              assert(new String(secret, UTF_8) === mgr.getSecretKey())
+            } else {
+              intercept[IllegalArgumentException] {
+                mgr.initializeAuth()
+              }
+              intercept[IllegalArgumentException] {
+                mgr.getSecretKey()
+              }
+            }
+          }
         }
-      }
-    )
+      )
+    }
   }
 
 }
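ScalaTest's `intercept` used in the negative cases above asserts that a block throws the given exception type and returns the thrown exception. As a rough illustration of what that assertion does, here is a hypothetical stand-in helper (not ScalaTest's implementation):

```scala
import scala.reflect.ClassTag

object InterceptSketch {
  /** Run `body`; return the thrown T, or fail if no matching exception was thrown. */
  def intercept[T <: Throwable](body: => Unit)(implicit ct: ClassTag[T]): T = {
    val caught: Option[T] =
      try { body; None }
      catch {
        // Only catch the expected exception type; anything else propagates.
        case t: Throwable if ct.runtimeClass.isInstance(t) => Some(t.asInstanceOf[T])
      }
    caught.getOrElse(
      throw new AssertionError(s"Expected ${ct.runtimeClass.getName} but no exception was thrown"))
  }

  def main(args: Array[String]): Unit = {
    // Mirrors the test's expectation: require(...) failures surface as IllegalArgumentException.
    val e = intercept[IllegalArgumentException] {
      require(false, "A secret key must be specified")
    }
    println(e.getMessage) // "requirement failed: A secret key must be specified"
  }
}
```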

docs/security.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ title: Security
 
 Spark currently supports authentication via a shared secret. Authentication can be configured to be on via the `spark.authenticate` configuration parameter. This parameter controls whether the Spark communication protocols do authentication using the shared secret. This authentication is a basic handshake to make sure both sides have the same shared secret and are allowed to communicate. If the shared secret is not identical they will not be allowed to communicate. The shared secret is created as follows:
 
-* For Spark on [YARN](running-on-yarn.html) deployments, configuring `spark.authenticate` to `true` will automatically handle generating and distributing the shared secret. Each application will use a unique shared secret.
+* For Spark on [YARN](running-on-yarn.html) and local deployments, configuring `spark.authenticate` to `true` will automatically handle generating and distributing the shared secret. Each application will use a unique shared secret.
 * For other types of Spark deployments, the Spark parameter `spark.authenticate.secret` should be configured on each of the nodes. This secret will be used by all the Master/Workers and applications.
 
 ## Web UI
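For the "other types of deployments" case mentioned in the docs, the shared secret still has to be supplied explicitly and match on every node. A hedged sketch of setting it programmatically; the master URL and secret value are placeholders, while `spark.authenticate` and `spark.authenticate.secret` are the real configuration keys.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Standalone-style deployment: the secret is not auto-generated, so it must be set
// here (and, for the daemons, typically also in each node's spark-defaults.conf).
object ExplicitSecretExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("explicit-secret-example")           // hypothetical app name
      .setMaster("spark://master-host:7077")           // hypothetical standalone master URL
      .set("spark.authenticate", "true")
      .set("spark.authenticate.secret", "change-me")   // placeholder secret value
    val sc = new SparkContext(conf)
    // ... application code ...
    sc.stop()
  }
}
```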
