diff --git a/docs/building-spark.md b/docs/building-spark.md
index 56b892696ee2c66c3076757697a0a5b86c0f2306..8353b7a520b8ed713500e6421e3c95f0a233ac35 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -132,20 +132,6 @@ Thus, the full flow for running continuous-compilation of the `core` submodule m
     $ cd core
     $ ../build/mvn scala:cc
 
-## Speeding up Compilation with Zinc
-
-[Zinc](https://github.com/typesafehub/zinc) is a long-running server version of SBT's incremental
-compiler. When run locally as a background process, it speeds up builds of Scala-based projects
-like Spark. Developers who regularly recompile Spark with Maven will be the most interested in
-Zinc. The project site gives instructions for building and running `zinc`; OS X users can
-install it using `brew install zinc`.
-
-If using the `build/mvn` package `zinc` will automatically be downloaded and leveraged for all
-builds. This process will auto-start after the first time `build/mvn` is called and bind to port
-3030 unless the `ZINC_PORT` environment variable is set. The `zinc` process can subsequently be
-shut down at any time by running `build/zinc-<version>/bin/zinc -shutdown` and will automatically
-restart whenever `build/mvn` is called.
-
 ## Building with SBT
 
 Maven is the official build tool recommended for packaging Spark, and is the *build of reference*.
@@ -159,8 +145,14 @@ can be set to control the SBT build. For example:
 
 To avoid the overhead of launching sbt each time you need to re-compile, you can launch sbt
 in interactive mode by running `build/sbt`, and then run all build commands at the command
-prompt. For more recommendations on reducing build time, refer to the
-[Useful Developer Tools page](http://spark.apache.org/developer-tools.html).
+prompt.
+
+## Speeding up Compilation
+
+Developers who compile Spark frequently may want to speed up compilation, for example by using
+Zinc (for developers who build with Maven) or by avoiding re-compilation of the assembly JAR
+(for developers who build with SBT). For more information about how to do this, refer to the
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html#reducing-build-times).
 
 ## Encrypted Filesystems
 
@@ -190,29 +182,16 @@ The following is an example of a command to run the tests:
 
     ./build/mvn test
 
-The ScalaTest plugin also supports running only a specific Scala test suite as follows:
-
-    ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.ReplSuite test
-    ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.* test
-
-or a Java test:
-
-    ./build/mvn test -P... -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite
-
 ## Testing with SBT
 
 The following is an example of a command to run the tests:
 
     ./build/sbt test
 
-To run only a specific test suite as follows:
-
-    ./build/sbt "test-only org.apache.spark.repl.ReplSuite"
-    ./build/sbt "test-only org.apache.spark.repl.*"
-
-To run test suites of a specific sub project as follows:
+## Running Individual Tests
 
-    ./build/sbt core/test
+For information about how to run individual tests, refer to the
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html#running-individual-tests).
 
 ## PySpark pip installable