43 changes: 11 additions & 32 deletions docs/building-spark.md
@@ -132,20 +132,6 @@ Thus, the full flow for running continuous-compilation of the `core` submodule may look like:

    $ cd core
    $ ../build/mvn scala:cc

-## Speeding up Compilation with Zinc
-
-[Zinc](https://github.com/typesafehub/zinc) is a long-running server version of SBT's incremental
-compiler. When run locally as a background process, it speeds up builds of Scala-based projects
-like Spark. Developers who regularly recompile Spark with Maven will be the most interested in
-Zinc. The project site gives instructions for building and running `zinc`; OS X users can
-install it using `brew install zinc`.
-
-If using the `build/mvn` package, `zinc` will automatically be downloaded and leveraged for all
-builds. This process will auto-start after the first time `build/mvn` is called and bind to port
-3030 unless the `ZINC_PORT` environment variable is set. The `zinc` process can subsequently be
-shut down at any time by running `build/zinc-<version>/bin/zinc -shutdown` and will automatically
-restart whenever `build/mvn` is called.
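The removed paragraph above says that `build/mvn` binds the `zinc` server to port 3030 unless the `ZINC_PORT` environment variable is set. A minimal sketch of that port-selection rule, as a launcher script might express it (hypothetical shell; not the actual `build/mvn` source):

```shell
# Hypothetical sketch of the behavior the docs describe:
# honor ZINC_PORT when the caller sets it, otherwise default to 3030.
ZINC_PORT="${ZINC_PORT:-3030}"
echo "zinc will bind to port ${ZINC_PORT}"
```

So, for example, running `ZINC_PORT=3035 build/mvn ...` would make the background server bind to port 3035 instead of the default.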

## Building with SBT

Maven is the official build tool recommended for packaging Spark, and is the *build of reference*.
@@ -159,8 +145,14 @@ can be set to control the SBT build. For example:

To avoid the overhead of launching sbt each time you need to re-compile, you can launch sbt
in interactive mode by running `build/sbt`, and then run all build commands at the command
-prompt. For more recommendations on reducing build time, refer to the
-[Useful Developer Tools page](http://spark.apache.org/developer-tools.html).
+prompt.

+## Speeding up Compilation
+
+Developers who compile Spark frequently may want to speed up compilation, e.g., by using Zinc
+(for developers who build with Maven) or by avoiding re-compilation of the assembly JAR (for
+developers who build with SBT). For more information about how to do this, refer to the
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html#reducing-build-times).
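As an illustration of the interactive workflow described above, a session might look like the following (a sketch against a Spark checkout, not a runnable script; the suite name is borrowed from the test examples on this page):

```
$ ./build/sbt            # start sbt once; later commands reuse the warm JVM
> compile                # incremental compile at the sbt prompt
> test-only org.apache.spark.repl.ReplSuite   # re-run a single suite
> exit
```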

## Encrypted Filesystems

@@ -190,29 +182,16 @@ The following is an example of a command to run the tests:

    ./build/mvn test

-The ScalaTest plugin also supports running only a specific Scala test suite as follows:
-
-    ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.ReplSuite test
-    ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.* test
-
-or a Java test:
-
-    ./build/mvn test -P... -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite
-
## Testing with SBT

The following is an example of a command to run the tests:

    ./build/sbt test

-To run only a specific test suite:
-
-    ./build/sbt "test-only org.apache.spark.repl.ReplSuite"
-    ./build/sbt "test-only org.apache.spark.repl.*"
-
-To run the test suites of a specific sub-project:
+## Running Individual Tests

-    ./build/sbt core/test
+For information about how to run individual tests, refer to the
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html#running-individual-tests).

## PySpark pip installable
