Conversation

@witgo (Contributor) commented May 5, 2014

No description provided.

@AmplabJenkins commented:

Can one of the admins verify this patch?

@witgo changed the title from "Add missing description to spark-env.sh.template" to "SPARK-1756: Add missing description to spark-env.sh.template" May 8, 2014

Contributor commented:

I don't think we want to document this anymore. It's been subsumed by passing --driver-memory to the spark shell or spark-submit tools.
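
For reference, the flag-based flow described here sets the driver heap per invocation; a minimal illustration, where the 2g value, example.Main, and example-app.jar are placeholders:

# Set the driver's heap size per run instead of in spark-env.sh
./bin/spark-shell --driver-memory 2g
./bin/spark-submit --driver-memory 2g --class example.Main example-app.jar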

Contributor commented:

Also this isn't in the correct section. But again, I don't think we want it.

@witgo (Author) replied:

./bin/spark-shell --driver-memory 2g =>

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main

@pwendell -Xms512m -Xmx512m is not correct: --driver-memory 2g was passed, but the JVM still launches with the 512m default.

Contributor replied:

There is a bug:
https://github.com/apache/spark/blob/master/bin/spark-submit#L27

This should say SPARK_DRIVER_MEMORY

@witgo (Author) replied:

If so, then this existing block:

if [ ! -z $DRIVER_MEMORY ] && [ ! -z $DEPLOY_MODE ] && [ $DEPLOY_MODE = "client" ]; then
  export SPARK_MEM=$DRIVER_MEMORY
fi

is not correct
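
A minimal sketch of that guard using the suggested SPARK_DRIVER_MEMORY name (an illustration of the rename, not necessarily the exact fix that landed):

# Sketch: test the suggested SPARK_DRIVER_MEMORY variable instead of
# DRIVER_MEMORY; quoting the expansions keeps the tests valid when the
# variables are empty, which also makes a separate -z check on
# DEPLOY_MODE unnecessary.
if [ -n "$SPARK_DRIVER_MEMORY" ] && [ "$DEPLOY_MODE" = "client" ]; then
  export SPARK_MEM=$SPARK_DRIVER_MEMORY
fi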


@witgo (Author) replied:

Yes, it works for me.
./bin/spark-shell --driver-memory 2g =>

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms2g -Xmx2g org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main

But in --driver-memory 2g --class org.apache.spark.repl.Main, the --driver-memory 2g argument is unnecessary: the setting has already been applied to the JVM as -Xms2g -Xmx2g, so forwarding the flag again is redundant.
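
One way the launcher could avoid forwarding the redundant flag is to consume it itself; a hypothetical sketch (not the actual change from PR 730):

# Hypothetical sketch: peel --driver-memory off the argument list,
# export it for the heap-sizing logic, and forward only the remainder.
ARGS=()
while (($#)); do
  if [ "$1" = "--driver-memory" ]; then
    export SPARK_DRIVER_MEMORY=$2  # later applied as -Xms/-Xmx
    shift 2
  else
    ARGS+=("$1")
    shift
  fi
done
exec "$SPARK_HOME"/bin/spark-class org.apache.spark.deploy.SparkSubmit "${ARGS[@]}"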

@witgo closed this May 11, 2014
@witgo (Author) commented May 11, 2014

A better solution: PR 730

@witgo deleted the spark_env branch May 11, 2014 06:23
helenyugithub pushed a commit to helenyugithub/spark that referenced this pull request Apr 20, 2020
Add an implementation of the SPARK-25299 shuffle storage plugin that asynchronously backs up shuffle data to remote storage (apache#646)

* Add an implementation of the SPARK-25299 shuffle storage plugin that asynchronously backs up shuffle data to remote storage.

* Don't write dependency reduced pom for async shuffle upload core.

Seems to cause the shade plugin to hang for this particular module for some reason...

* Start adding javadoc

* Rename a class, add more docs

* More docs

* More docs. Rename another s3 -> hadoop reference.

* Remove trailing period

* More documentation.

* S3 -> Hadoop again

* Move a bunch of references from S3 -> Hadoop or remote

* Fix build

* Add async-shuffle-upload-core to bom

* Try using a create method from the scala object

Something strange with a NoMethodDefError?
