10 changes: 8 additions & 2 deletions bin/pyspark
@@ -23,6 +23,8 @@ FWDIR="$(cd `dirname $0`/..; pwd)"
# Export this as SPARK_HOME
export SPARK_HOME="$FWDIR"

source $FWDIR/bin/utils.sh

SCALA_VERSION=2.10

if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
@@ -67,9 +69,10 @@ fi
# We export Spark submit arguments as an environment variable because shell.py must run as a
# PYTHONSTARTUP script, which does not take in arguments. This is required for IPython notebooks.

gatherSparkSubmitOpts $@
PYSPARK_SUBMIT_ARGS=""
whitespace="[[:space:]]"
for i in "$@"; do
for i in ${SUBMISSION_OPTS[@]}; do
if [[ $i =~ \" ]]; then i=$(echo $i | sed 's/\"/\\\"/g'); fi
if [[ $i =~ $whitespace ]]; then i=\"$i\"; fi
PYSPARK_SUBMIT_ARGS="$PYSPARK_SUBMIT_ARGS $i"
@@ -90,7 +93,10 @@ fi
if [[ "$1" =~ \.py$ ]]; then
echo -e "\nWARNING: Running python applications through ./bin/pyspark is deprecated as of Spark 1.0." 1>&2
echo -e "Use ./bin/spark-submit <python file>\n" 1>&2
exec $FWDIR/bin/spark-submit "$@"
primary=$1
shift
gatherSparkSubmitOpts $@
exec $FWDIR/bin/spark-submit ${SUBMISSION_OPTS[@]} $primary ${APPLICATION_OPTS[@]}
else
# Only use ipython if no command line arguments were provided [SPARK-1134]
if [[ "$IPYTHON" = "1" ]]; then
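As an illustration of the new flow (hypothetical invocation): running

    bin/pyspark app.py --master local[4] foo bar

yields primary=app.py, SUBMISSION_OPTS=(--master local[4]), and APPLICATION_OPTS=(foo bar), so the script execs

    spark-submit --master local[4] app.py foo bar
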
9 changes: 6 additions & 3 deletions bin/spark-shell
@@ -37,7 +37,10 @@ if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
exit 0
fi

function main(){
source $FWDIR/bin/utils.sh
gatherSparkSubmitOpts $@

function main() {
if $cygwin; then
# Workaround for issue involving JLine and Cygwin
# (see http://sourceforge.net/p/jline/bugs/40/).
@@ -46,11 +49,11 @@ function main(){
# (see https://github.com/sbt/sbt/issues/562).
stty -icanon min 1 -echo > /dev/null 2>&1
export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_OPTS[@]} spark-shell ${APPLICATION_OPTS[@]}
Contributor: Does this handle quoted strings, e.g. --name "awesome app"? You may need to put double quotes around the argument lists.

Contributor Author: Confirmed it doesn't; we need to add logic similar to the for loop in bin/pyspark to handle quoted arguments. A sketch follows below.
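A minimal sketch of that quoting logic, mirroring the loop in bin/pyspark (the SUBMIT_ARGS variable name is assumed):

    # Escape embedded double quotes, then re-quote any argument containing whitespace
    SUBMIT_ARGS=""
    whitespace="[[:space:]]"
    for i in "${SUBMISSION_OPTS[@]}"; do
      if [[ $i =~ \" ]]; then i=$(echo $i | sed 's/\"/\\\"/g'); fi
      if [[ $i =~ $whitespace ]]; then i=\"$i\"; fi
      SUBMIT_ARGS="$SUBMIT_ARGS $i"
    done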

stty icanon echo > /dev/null 2>&1
else
export SPARK_SUBMIT_OPTS
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_OPTS[@]} spark-shell ${APPLICATION_OPTS[@]}
fi
}

56 changes: 56 additions & 0 deletions bin/utils.sh
@@ -0,0 +1,56 @@
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Gather all spark-submit options into SUBMISSION_OPTS
function gatherSparkSubmitOpts() {
SUBMISSION_OPTS=()
APPLICATION_OPTS=()
Contributor: SUBMISSION_OPTS sounds a little strange to me. At the same time, I realize we already have SPARK_SUBMIT_OPTS elsewhere. How about calling these two SPARK_SUBMIT_ARGS and SPARK_APPLICATION_ARGS instead?

Contributor Author: I also used SPARK_SUBMIT_ARGS at first, but there is already a SPARK_SUBMIT_OPTS env var used in PySpark. The two would be easy to confuse, which is why I settled on SUBMISSION_OPTS.

while (($#)); do
case $1 in
--master | --deploy-mode | --class | --name | --jars | --py-files | --files)
;&

--conf | --properties-file | --driver-memory | --driver-java-options)
;&

--driver-library-path | --driver-class-path | --executor-memory | --driver-cores)
;&

--total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
Contributor: Any reason to spread this out over 4 cases? Why not just group them into 1? (You could use a backslash to escape the newline.)

Contributor Author: Just to keep the lines from getting too long. I tried escaping the newline with a backslash, but it doesn't work in Bash 4.3.8 :(

Member: This code didn't work in bash 4.3.0 and 4.1.2 on CentOS 6, or in 3.2.51 on Mac OS X Mavericks. I guess you use BSD, right? I think this is a BSD-specific issue.
I re-PRed #1825 and switched to backslash-based multiline patterns, which worked in bash 4.3.0 and 4.1.2 on CentOS 6 and 3.2.51 on Mac OS X Mavericks.

@liancheng can you check whether that works in 4.3.8?

Member: Let me make a small correction.
This code works on bash 4.3.0 and 4.1.2 on CentOS 6, but doesn't work on 3.2.51 on Mac OS X Mavericks.

#1825 still works on 4.3.0, 4.1.2, and 3.2.51.

If #1825 doesn't work on 4.3.8, some workarounds may be needed.

Contributor Author: I'm using Mavericks with a brew-installed bash 4.3.8. And yes, #1825 works, thanks.
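
For reference, a minimal sketch of the backslash-based multiline pattern discussed (mirroring the option list above; not necessarily #1825's exact code). Backslash-newline joins the pattern into one logical line, avoiding the ";&" fall-through, which requires bash 4.0 or newer and therefore fails on the bash 3.2 that ships with OS X:

    case $1 in
      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
      --conf | --properties-file | --driver-memory | --driver-java-options | \
      --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
      --total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
        SUBMISSION_OPTS+=($1); shift
        SUBMISSION_OPTS+=($1); shift
        ;;
    esac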

if [[ $# -lt 2 ]]; then
usage
Contributor: Maybe I'm missing something, but where does usage come from?

Member: utils.sh expects the scripts that source it to implement usage(), but this is implicit. In #1825, the new commit fixes this: the newer utils.sh requires scripts to implement a usage function.

Contributor Author: My fault, I forgot to add the usage function at first...

exit 1;
fi
SUBMISSION_OPTS+=($1); shift
SUBMISSION_OPTS+=($1); shift
;;

--verbose | -v | --supervise)
SUBMISSION_OPTS+=($1); shift
;;

*)
APPLICATION_OPTS+=($1); shift
;;
esac
done

export SUBMISSION_OPTS
export APPLICATION_OPTS
}
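
An illustration of the resulting split (hypothetical arguments):

    gatherSparkSubmitOpts --master local[4] --verbose app.py arg1
    # SUBMISSION_OPTS  = (--master local[4] --verbose)
    # APPLICATION_OPTS = (app.py arg1)
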
2 changes: 1 addition & 1 deletion python/pyspark/java_gateway.py
@@ -39,7 +39,7 @@ def launch_gateway():
submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
submit_args = submit_args if submit_args is not None else ""
submit_args = shlex.split(submit_args)
command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
command = [os.path.join(SPARK_HOME, script)] + submit_args + ["pyspark-shell"]
Contributor: I don't see how application args for the pyspark shell are handled here. Is this still WIP?

Contributor: You could probably get them through os.environ.get("SPARK_APPLICATION_ARGS") here (or whatever you decide to call the environment variable).

Member: I also modified this in #1825.

Contributor Author: Currently, if we call something like bin/pyspark app.py --name "awesome app", bin/pyspark delegates directly to spark-submit and doesn't go down this code path. This implies that submit_args can only contain spark-submit options.

Contributor: @liancheng Ah yes, you're right. This doesn't actually do anything, because the main class for the pyspark-shell is the py4j.JavaGateway, which is not interested in IPYTHON arguments like notebook.
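
To make the reordering concrete (with an assumed value for the env var): if PYSPARK_SUBMIT_ARGS="--master local[4]", the gateway used to launch

    spark-submit pyspark-shell --master local[4]

and now launches

    spark-submit --master local[4] pyspark-shell

so the submit options precede the fixed "pyspark-shell" primary resource, as spark-submit expects.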

if not on_windows:
# Don't send ctrl-c / SIGINT to the Java gateway:
def preexec_func():