1 change: 1 addition & 0 deletions conf/spark-env.sh.template
@@ -38,6 +38,7 @@
# - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
# - SPARK_WORKER_DIR, to set the working directory of worker processes
# - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
+# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
Contributor:
I don't think we want to document this anymore. It's been subsumed by passing --driver-memory to the spark shell or spark-submit tools.
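For illustration, the replacement usage looks roughly like this (the 2g value and the application class/jar below are placeholders, not from this PR):

./bin/spark-shell --driver-memory 2g
./bin/spark-submit --driver-memory 2g --class com.example.MyApp my-app.jar  # hypothetical class and jar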

Contributor:

Also this isn't in the correct section. But again, I don't think we want it.

Contributor Author:

./bin/spark-shell --driver-memory 2g =>

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main

@pwendell -Xms512m -Xmx512m is not correct.
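For context, a simplified sketch of where those flags come from (loosely based on bin/spark-class at the time; not the exact code):

SPARK_MEM=${SPARK_MEM:-512m}                          # default of 512m explains the -Xms512m -Xmx512m above
JAVA_OPTS="$JAVA_OPTS -Xms$SPARK_MEM -Xmx$SPARK_MEM"  # driver JVM heap is sized from SPARK_MEM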

Contributor:

There is a bug:
https://github.com/apache/spark/blob/master/bin/spark-submit#L27

This should say SPARK_DRIVER_MEMORY.

Contributor Author:

If so,

if [ ! -z $DRIVER_MEMORY ] && [ ! -z $DEPLOY_MODE ] && [ $DEPLOY_MODE = "client" ]; then
  export SPARK_MEM=$DRIVER_MEMORY
fi

is not correct
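For comparison, a minimal sketch of the corrected check, assuming the launcher exports SPARK_DRIVER_MEMORY as suggested above (the expansions are also quoted so the test doesn't break when a variable is unset):

if [ -n "$SPARK_DRIVER_MEMORY" ] && [ "$DEPLOY_MODE" = "client" ]; then
  # size the driver JVM from SPARK_DRIVER_MEMORY instead of the unset DRIVER_MEMORY
  export SPARK_MEM=$SPARK_DRIVER_MEMORY
fi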

Contributor Author:

Yes, it works for me.
./bin/spark-shell --driver-memory 2g =>

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms2g -Xmx2g org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main

But in --driver-memory 2g --class org.apache.spark.repl.Main, the --driver-memory 2g is unnecessary.

# - SPARK_HISTORY_OPTS, to set config properties only for the history server (e.g. "-Dx=y")
# - SPARK_DAEMON_OPTS, to set config properties for all daemons (e.g. "-Dx=y")
# - SPARK_PUBLIC_DNS, to set the public dns name of the master or workers