
Conversation

@falaki (Contributor) commented Sep 18, 2015

In RUtils.sparkRPackagePath() we

  1. Call sys.props("spark.submit.deployMode"), which returns null if spark.submit.deployMode is not set.
  2. Call sparkConf.get("spark.submit.deployMode"), which throws NoSuchElementException if spark.submit.deployMode is not set.

This patch simply passes a default value ("cluster") for spark.submit.deployMode (see the sketch below).
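For context, a minimal sketch of the failing call and the workaround, written as a spark-shell-style snippet rather than the exact RUtils code:

```scala
import org.apache.spark.SparkConf

val sparkConf = new SparkConf()

// Without a default this throws NoSuchElementException when the application
// was not launched via spark-submit (which normally sets the property):
// val deployMode = sparkConf.get("spark.submit.deployMode")

// Passing a default avoids the exception; this patch passes "cluster":
val deployMode = sparkConf.get("spark.submit.deployMode", "cluster")
```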

cc @rxin

@falaki (Contributor, Author) commented Sep 19, 2015

cc @shivaram

@shivaram (Contributor)

Any reason this would not be set? My assumption was that all spark-submit applications had this set, so I guess this is for applications not using spark-submit?

cc @sun-rui, who added the function

@falaki (Contributor, Author) commented Sep 19, 2015

Yes, if an application starts the JVM manually (not via spark-submit), this call will fail. The change just adds safety without compromising functionality or correctness.
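To illustrate the scenario (the object name, app name, and master URL are placeholders, not from this PR): an application that builds its own SparkContext never goes through spark-submit, so nothing sets spark.submit.deployMode.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative only: an app that starts the JVM itself and builds its own
// SparkContext, so spark-submit never runs and spark.submit.deployMode is never set.
object ManualJvmStart {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("manual-jvm-start")
    val sc = new SparkContext(conf)
    // Reading spark.submit.deployMode here without a default would throw
    // NoSuchElementException, which is what this PR guards against.
    sc.stop()
  }
}
```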

@SparkQA commented Sep 19, 2015

Test build #42699 has finished for PR 8832 at commit 8b80886.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Contributor

If anything, the default deploy mode should be "client", to be consistent with assumptions elsewhere in the code.

@sun-rui (Contributor) commented Sep 21, 2015

@falaki, are you using RStudio, where spark-submit is not involved?

I don't think we can simply set a default value, because the default mode may not match the Spark master. For example, if the master is 'yarn-cluster' but the default mode is "client", they do not match.

The correct policy (sketched below) is: if the master does not contain deploy mode information, the deploy mode is "client"; otherwise, get the deploy mode from the master URL.
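A rough sketch of that policy, assuming the deploy mode, when present, is embedded as a "-client"/"-cluster" suffix in the master URL; the helper name is illustrative, not Spark's actual code:

```scala
// Illustrative helper: derive the deploy mode from the master URL, defaulting to "client".
def deployModeFromMaster(master: String): String = master match {
  case m if m.endsWith("-cluster") => "cluster" // e.g. the old "yarn-cluster"
  case m if m.endsWith("-client")  => "client"  // e.g. the old "yarn-client"
  case _                           => "client"  // no deploy mode embedded in the master URL
}
```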

I think this issue is not specific to SparkR. @andrewor14, is there any policy for handling default configurations when a Spark application is not launched via spark-submit? Should we extract some logic from spark-submit so that it can be called in non-spark-submit cases, to keep consistency?

@andrewor14 (Contributor)

retest this please

@andrewor14 (Contributor)

> if the master does not contain deploy mode information, the deploy mode is "client"; otherwise, get the deploy mode from the master URL.

We're actually deprecating the master URLs yarn-client and yarn-cluster (#8385). In general I find it pretty confusing to have the deploy mode embedded in the master URL.

> I think this issue is not specific to SparkR. @andrewor14, is there any policy for handling default configurations when a Spark application is not launched via spark-submit? Should we extract some logic from spark-submit so that it can be called in non-spark-submit cases, to keep consistency?

We do have the launcher library, but beyond that I don't think this is something we support because it gets pretty difficult to maintain. AFAIK deploy mode is the only such config so this should be fine.
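For reference, a hedged sketch of using the launcher library (org.apache.spark.launcher.SparkLauncher) so that properties such as spark.submit.deployMode are still set by spark-submit; the jar path, class name, and master below are placeholders:

```scala
import org.apache.spark.launcher.SparkLauncher

// Launch an application through spark-submit programmatically
// instead of starting the JVM by hand.
object LaunchViaLauncher {
  def main(args: Array[String]): Unit = {
    val process = new SparkLauncher()
      .setAppResource("/path/to/app.jar")  // placeholder application jar
      .setMainClass("com.example.MyApp")   // placeholder main class
      .setMaster("yarn")                   // placeholder master
      .setDeployMode("client")
      .launch()
    process.waitFor()
  }
}
```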

@andrewor14 (Contributor)

LGTM, merging once tests pass.

@SparkQA commented Sep 22, 2015

Test build #42800 has finished for PR 8832 at commit a643281.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@rxin (Contributor) commented Sep 22, 2015

I've merged this.

@asfgit asfgit closed this in c986e93 Sep 22, 2015
asfgit pushed a commit that referenced this pull request Sep 22, 2015
…s set

In `RUtils.sparkRPackagePath()` we
1. Call `sys.props("spark.submit.deployMode")`, which returns null if `spark.submit.deployMode` is not set.
2. Call `sparkConf.get("spark.submit.deployMode")`, which throws `NoSuchElementException` if `spark.submit.deployMode` is not set. This patch simply passes a default value ("cluster") for `spark.submit.deployMode`.

cc rxin

Author: Hossein <[email protected]>

Closes #8832 from falaki/SPARK-10711.

(cherry picked from commit c986e93)
Signed-off-by: Reynold Xin <[email protected]>
