5 changes: 4 additions & 1 deletion bin/spark-shell
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
Contributor:

where do we define other options?

Member Author:

I tested the other options and this one looks like the only valid one. I described it in the PR description.

Contributor:

I mean, I didn't find

Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn,
                              k8s://https://host:port, or local (Default: local[*]).
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).

in the shell script. Where do we define them?

Member Author:

Oh haha. Sorry. That's in SparkSubmitArguments.printUsageAndExit. That's why I left #22919 (comment).
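
For reference, a minimal sketch of how the two halves of the help text fit together (assuming a standard Spark checkout; the command below is the usual way to see the combined output):

    # The usage header printed here comes from the _SPARK_CMD_USAGE
    # environment variable exported by bin/spark-shell; the option list
    # that follows (--master, --deploy-mode, ...) is printed on the JVM
    # side by SparkSubmitArguments.printUsageAndExit.
    ./bin/spark-shell --help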

Member:

Shall we also define the -i behavior here? I think for now that option is also accepted by the REPL.

Member Author (@HyukjinKwon, Nov 5, 2018):

Yeah, but -i doesn't handle implicits like toDF or symbols, which are pretty basic ones (unless they're explicitly imported within the user's program). I think we'd better avoid documenting it for now.
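
To make the difference concrete, here is a small sketch; the file path and DataFrame contents are hypothetical, and the exact failure message for -i may vary by Scala version:

    # A preload script that relies on the implicits spark-shell imports
    # (toDF comes from spark.implicits._).
    cat > /tmp/preload.scala <<'EOF'
    val df = Seq((1, "a"), (2, "b")).toDF("id", "name")
    df.show()
    EOF

    # -I interprets the file line by line inside the initialized REPL,
    # so spark.implicits._ is already in scope and toDF resolves.
    ./bin/spark-shell -I /tmp/preload.scala

    # -i is also accepted by the REPL, but per the discussion above it
    # does not pick up those implicits, so toDF fails unless the script
    # itself adds: import spark.implicits._
    ./bin/spark-shell -i /tmp/preload.scala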


 # SPARK-4161: scala does not assume use of the java classpath,
 # so we need to add the "-Dscala.usejavacp=true" flag manually. We
8 changes: 7 additions & 1 deletion bin/spark-shell2.cmd
@@ -20,7 +20,13 @@ rem
 rem Figure out where the Spark framework is installed
 call "%~dp0find-spark-home.cmd"
 
-set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]
+set LF=^
+
+
+rem two empty lines are required
+set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]^%LF%%LF%^%LF%%LF%^
+Scala REPL options:^%LF%%LF%^
Member Author:

There seems to be no cleverer way than this to set newlines in variables in batch files: set LF=^ followed by two blank lines captures a bare linefeed into %LF% (the caret escapes the first line break, which is why two empty lines are required).

Member Author:

Script-specific information is included in _SPARK_CMD_USAGE, which looks like a more appropriate place for it than somewhere in SparkSubmitArguments.printUsageAndExit.

+  -I ^<file^>                  preload ^<file^>, enforcing line-by-line interpretation

 rem SPARK-4161: scala does not assume use of the java classpath,
 rem so we need to add the "-Dscala.usejavacp=true" flag manually. We