SPARK-1756: Add missing description to spark-env.sh.template #646
Conversation
Can one of the admins verify this patch?
I don't think we want to document this anymore. It's been subsumed by passing --driver-memory to the spark-shell or spark-submit tools.
Also this isn't in the correct section. But again, I don't think we want it.
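For context, the flag-based approach looks like this (a minimal sketch; --driver-memory is the spark-submit/spark-shell option under discussion, but the application class and jar names below are hypothetical):

# Set the driver heap at launch time instead of in spark-env.sh
./bin/spark-shell --driver-memory 4g
./bin/spark-submit --driver-memory 4g --class org.example.MyApp my-app.jar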
However, ./bin/spark-shell --driver-memory 2g currently expands to:
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main
@pwendell: -Xms512m -Xmx512m is not correct.
There is a bug:
https://github.com/apache/spark/blob/master/bin/spark-submit#L27
This should say SPARK_DRIVER_MEMORY
If so, then this check:
if [ ! -z $DRIVER_MEMORY ] && [ ! -z $DEPLOY_MODE ] && [ $DEPLOY_MODE = "client" ]; then
  export SPARK_MEM=$DRIVER_MEMORY
fi
is not correct.
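For illustration, a corrected check might look like the sketch below. It assumes, per the comment above, that the variable spark-submit actually receives is SPARK_DRIVER_MEMORY; the quoting is added defensively, and this is not necessarily the exact patch that landed:

# Sketch: read SPARK_DRIVER_MEMORY rather than the unset DRIVER_MEMORY,
# and quote expansions so empty values do not break the test
if [ -n "$SPARK_DRIVER_MEMORY" ] && [ "$DEPLOY_MODE" = "client" ]; then
  export SPARK_MEM="$SPARK_DRIVER_MEMORY"
fi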
Yes - does this work for you?:
Yes, it works for me.
Now ./bin/spark-shell --driver-memory 2g expands to:
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms2g -Xmx2g org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main
But in the trailing arguments --driver-memory 2g --class org.apache.spark.repl.Main, the --driver-memory 2g is unnecessary.
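As a quick way to check which heap flags actually reach the driver JVM, the launcher scripts of this era honor SPARK_PRINT_LAUNCH_COMMAND (a sketch; confirm that your build's bin/spark-class supports the variable):

# Echo the full java command line, including -Xms/-Xmx, before the
# shell starts, so the effective heap settings can be inspected
SPARK_PRINT_LAUNCH_COMMAND=1 ./bin/spark-shell --driver-memory 2g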
A better solution: PR 730