
Conversation

@zjffdu (Contributor) commented Sep 20, 2016

What changes were proposed in this pull request?

This is mostly for YARN mode; if I understand correctly, standalone mode doesn't need to distribute resources (sparkr.zip, pyspark.zip, etc.). This PR adds 2 options, spark.usePython and spark.useR, so that any project using both SparkR and PySpark can leverage them.
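
For illustration, a minimal sketch of how a JVM application might opt in programmatically (the two config keys come from this PR; the host application itself is hypothetical):

  // Hypothetical Scala host application (e.g. an interpreter server) that
  // embeds PySpark/SparkR and opts in to resource distribution.
  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setAppName("mixed-language-host")
    .set("spark.usePython", "true") // distribute pyspark.zip and py4j
    .set("spark.useR", "true")      // distribute sparkr.zip
  val sc = new SparkContext(conf)

The --conf flags in the test command below achieve the same effect at submit time.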

How was this patch tested?

Use the following command to launch SparkPi and notice that pyspark.zip, sparkr.zip, and py4j are all distributed to the executors.

bin/spark-submit --master yarn-client --conf spark.useR=true --conf spark.usePython=true --class org.apache.spark.examples.SparkPi examples/target/original-spark-examples_2.11-2.1.0-SNAPSHOT.jar

Client output

16/09/20 16:25:15 INFO Client: Uploading resource file:/private/var/folders/dp/hmchg5dd3vbcvds26q91spdw0000gp/T/spark-1fa45f53-e75e-4671-bf75-1ef554a3dda5/__spark_libs__3260692671624418275.zip -> hdfs://localhost:9009/user/jzhang/.sparkStaging/application_1474162755082_0035/__spark_libs__3260692671624418275.zip
16/09/20 16:25:16 INFO Client: Uploading resource file:/Users/jzhang/github/spark/R/lib/sparkr.zip#sparkr -> hdfs://localhost:9009/user/jzhang/.sparkStaging/application_1474162755082_0035/sparkr.zip
16/09/20 16:25:17 INFO Client: Uploading resource file:/Users/jzhang/github/spark/python/lib/pyspark.zip -> hdfs://localhost:9009/user/jzhang/.sparkStaging/application_1474162755082_0035/pyspark.zip
16/09/20 16:25:17 INFO Client: Uploading resource file:/Users/jzhang/github/spark/python/lib/py4j-0.10.3-src.zip -> hdfs://localhost:9009/user/jzhang/.sparkStaging/application_1474162755082_0035/py4j-0.10.3-src.zip
16/09/20 16:25:17 INFO Client: Uploading resource file:/private/var/folders/dp/hmchg5dd3vbcvds26q91spdw0000gp/T/spark-1fa45f53-e75e-4671-bf75-1ef554a3dda5/__spark_conf__2718308972579262508.zip -> hdfs://localhost:9009/user/jzhang/.sparkStaging/application_1474162755082_0035/__spark_conf__.zip

@SparkQA commented Sep 20, 2016

Test build #65644 has finished for PR 15159 at commit 2271887.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@felixcheung (Member) commented:

Do you have a test or use case that has both Python and R code? I'm not quite sure that actually works...

@zjffdu (Contributor, Author) commented Sep 21, 2016

@felixcheung Thanks for the review. As I mentioned in the JIRA description, Zeppelin/Livy would use both PySpark and SparkR. I tried this PR with Zeppelin (it needs some code changes in Zeppelin), and it works.

Contributor (review comment):

so now there's an isPython and usePython? Can you just use the old one?

@zjffdu (Contributor, Author) replied:

isPython and usePython have different semantics. isPython means it is a PySpark application whose primary resource is a Python script or pyspark-shell, and we use isPython to figure out the mainClass. usePython doesn't mean it is a PySpark application (it could be a Scala application that uses PySpark internally), and it is not related to the mainClass.
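
To make the distinction concrete, a rough sketch (a hypothetical helper, not the actual SparkSubmit code; the default for usePython is assumed):

  import org.apache.spark.SparkConf

  def flags(primaryResource: String, conf: SparkConf): (Boolean, Boolean) = {
    // isPython: the primary resource IS a Python application (a .py script
    // or the pyspark shell), so it also determines the mainClass to launch.
    val isPython = primaryResource.endsWith(".py") ||
      primaryResource == "pyspark-shell"
    // usePython: the app merely needs PySpark at runtime (it could be a
    // Scala app embedding PySpark); it only controls whether pyspark.zip
    // and py4j are distributed, never the mainClass. Default assumed here.
    val usePython = conf.getBoolean("spark.usePython", isPython)
    (isPython, usePython)
  }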

@zjffdu (Contributor, Author) commented Sep 23, 2016

Adding @rxin @davies @JoshRosen @shivaram for more feedback.

@holdenk (Contributor) commented Oct 7, 2016

Just a heads up: this has conflicts with master, so it might be good to update (I know a lot of reviewers use the spark-pr dashboard and may skip PRs which aren't mergeable).

@holdenk (Contributor) commented Oct 7, 2016

@zjffdu so is the intent of this to allow people to use PySpark from Scala Spark applications?

… for applications that use both pyspark and sparkr

@zjffdu (Contributor, Author) commented Oct 9, 2016

@holdenk that's correct.

@SparkQA commented Oct 9, 2016

Test build #66594 has finished for PR 15159 at commit 522e3e8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin (Contributor) commented Dec 15, 2016

I wonder if there isn't a better way to handle this without having to add more configs.

E.g., just distribute things if the user asks for them: if --py-files is provided, distribute those files and set env variables as if the application were a PySpark app, even if it isn't. In YARN's Client.scala, distribute the Python libs with the app when creating the jars archive; it's a tiny extra overhead given the size of the other jars. And pretty much deprecate PYSPARK_ARCHIVES_PATH: users can add those archives to spark.yarn.jars now and achieve the same effect.

Similar things for R, although I'm not really familiar with that path.

I might be overlooking something, but I think it would be nice to avoid adding more config options if possible.
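
For illustration, a rough sketch of the alternative @vanzin describes; the names pyFiles, pySparkArchives, and distribute are assumed here, not Spark's actual Client.scala API:

  // Infer the need for the Python runtime from --py-files instead of a new
  // config option.
  def distributePythonIfNeeded(
      pyFiles: Seq[String],
      pySparkArchives: Seq[String],
      distribute: String => Unit): Unit = {
    if (pyFiles.nonEmpty) {
      // The user ships Python files, so also ship pyspark.zip / py4j and
      // set PYTHONPATH env variables as for a regular PySpark application.
      (pySparkArchives ++ pyFiles).foreach(distribute)
    }
  }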

@HyukjinKwon (Member) commented:

(gentle ping @zjffdu)
