Commit e2c2312

Merge in changes in #752 (SPARK-1814)
1 parent 25cfe7b

File tree

1 file changed (+13 -6 lines)

docs/index.md

Lines changed: 13 additions & 6 deletions
@@ -24,20 +24,27 @@ right version of Scala from [scala-lang.org](http://www.scala-lang.org/download/
 
 # Running the Examples and Shell
 
-Spark comes with several sample programs. Scala, Java and Python examples are in the `examples/src/main` directory.
-To run one of the Java or Scala sample programs, use `bin/run-example <class> [params]` in the top-level Spark directory. For example,
+Spark comes with several sample programs. Scala, Java and Python examples are in the
+`examples/src/main` directory. To run one of the Java or Scala sample programs, use
+`bin/run-example <class> [params]` in the top-level Spark directory. (Behind the scenes, this
+invokes the more general
+[Spark submit script](cluster-overview.html#launching-applications-with-spark-submit) for
+launching applications). For example,
 
     ./bin/run-example SparkPi 10
 
-You can also run Spark interactively through modified versions of the Scala shell. This is a great way to learn the framework.
+You can also run Spark interactively through modified versions of the Scala shell. This is a
+great way to learn the framework.
 
     ./bin/spark-shell --master local[2]
 
-The `--master` option specifies the [master URL for a distributed cluster](scala-programming-guide.html#master-urls),
-or `local` to run locally with one thread, or `local[N]` to run locally with N threads. You should start by using
+The `--master` option specifies the
+[master URL for a distributed cluster](scala-programming-guide.html#master-urls), or `local` to run
+locally with one thread, or `local[N]` to run locally with N threads. You should start by using
 `local` for testing. For a full list of options, run Spark shell with the `--help` option.
 
-If Scala is not your cup of tea, you can also try out Spark using the python interface with `bin/pyspark <program> [params]`. For example,
+Spark also provides a Python interface. To run an example Spark application written in Python, use
+`bin/pyspark <program> [params]`. For example,
 
     ./bin/pyspark examples/src/main/python/pi.py local[2] 10
 

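The updated wording notes that `bin/run-example` now delegates to the more general `spark-submit` launcher. As a rough sketch of what that delegation amounts to, the SparkPi example could also be launched directly through `spark-submit`; the examples-jar path below is an assumption and depends on the Scala version and on how Spark was built or packaged:

    # Hypothetical direct equivalent of `./bin/run-example SparkPi 10`, here run on a
    # local master with two threads. The jar location is an assumption; adjust it to
    # match your build or distribution.
    ./bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master local[2] \
      examples/target/scala-2.10/spark-examples-*.jar \
      10

In a binary distribution the examples jar may sit elsewhere (for example under `lib/`), which is why `bin/run-example` remains the simpler entry point for the bundled examples.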