@@ -132,20 +132,6 @@ Thus, the full flow for running continuous-compilation of the `core` submodule m
132132 $ cd core
133133 $ ../build/mvn scala:cc
134134
135- ## Speeding up Compilation with Zinc
136-
137- [Zinc](https://github.com/typesafehub/zinc) is a long-running server version of SBT's incremental
138- compiler. When run locally as a background process, it speeds up builds of Scala-based projects
139- like Spark. Developers who regularly recompile Spark with Maven will be the most interested in
140- Zinc. The project site gives instructions for building and running `zinc`; OS X users can
141- install it using `brew install zinc`.
142-
143- If using the `build/mvn` package, `zinc` will automatically be downloaded and leveraged for all
144- builds. This process will auto-start after the first time `build/mvn` is called and bind to port
145- 3030 unless the `ZINC_PORT` environment variable is set. The `zinc` process can subsequently be
146- shut down at any time by running `build/zinc-<version>/bin/zinc -shutdown` and will automatically
147- restart whenever `build/mvn` is called.
148-
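The port-selection rule described above (bind to 3030 unless `ZINC_PORT` is set) is ordinary shell default-expansion; the following is a minimal sketch of that same logic, where the `port` variable name is illustrative rather than taken from Spark's scripts:

```shell
#!/bin/sh
# Use ZINC_PORT from the environment if set; otherwise fall back to the
# documented default of 3030. ("port" is an illustrative variable name,
# not one used by Spark's build scripts.)
port="${ZINC_PORT:-3030}"
echo "zinc would bind to port ${port}"
```

Running the sketch with `ZINC_PORT=4040` in the environment prints port 4040; with the variable unset it prints the 3030 default.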
149135## Building with SBT
150136
151137Maven is the official build tool recommended for packaging Spark, and is the *build of reference*.
@@ -159,8 +145,14 @@ can be set to control the SBT build. For example:
159145
160146To avoid the overhead of launching sbt each time you need to re-compile, you can launch sbt
161147in interactive mode by running ` build/sbt ` , and then run all build commands at the command
162- prompt. For more recommendations on reducing build time, refer to the
163- [Useful Developer Tools page](http://spark.apache.org/developer-tools.html).
148+ prompt.
149+
150+ ## Speeding up Compilation
151+
152+ Developers who compile Spark frequently may want to speed up compilation; e.g., by using Zinc
153+ (for developers who build with Maven) or by avoiding re-compilation of the assembly JAR (for
154+ developers who build with SBT). For more information about how to do this, refer to the
155+ [Useful Developer Tools page](http://spark.apache.org/developer-tools.html#reducing-build-times).
164156
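The interactive `build/sbt` workflow described earlier can also be scripted by piping a sequence of commands into one launcher invocation, so the JVM and sbt start only once. A minimal sketch follows; `compile`, `core/test`, and `exit` are example sbt shell commands, and the actual pipe into `./build/sbt` is left commented out because it requires a Spark checkout:

```shell
#!/bin/sh
# Emit one example sbt shell command per line.
printf '%s\n' compile core/test exit
# In a Spark checkout, the same sequence could drive a single
# interactive session, paying the launch cost only once:
#   printf '%s\n' compile core/test exit | ./build/sbt
```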
165157## Encrypted Filesystems
166158
@@ -190,29 +182,16 @@ The following is an example of a command to run the tests:
190182
191183 ./build/mvn test
192184
193- The ScalaTest plugin also supports running only a specific Scala test suite as follows:
194-
195- ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.ReplSuite test
196- ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.* test
197-
198- or a Java test:
199-
200- ./build/mvn test -P... -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite
201-
202185## Testing with SBT
203186
204187The following is an example of a command to run the tests:
205188
206189 ./build/sbt test
207190
208- To run only a specific test suite:
209-
210- ./build/sbt "test-only org.apache.spark.repl.ReplSuite"
211- ./build/sbt "test-only org.apache.spark.repl.*"
212-
213- To run the test suites of a specific sub-project:
191+ ## Running Individual Tests
214192
215- ./build/sbt core/test
193+ For information about how to run individual tests, refer to the
194+ [Useful Developer Tools page](http://spark.apache.org/developer-tools.html#running-individual-tests).
216195
217196## PySpark pip installable
218197