
Commit 61b4988 (1 parent: 9c8deef)

Document that Java 7, Python 2.6, Scala 2.10, Hadoop < 2.6 are deprecated in Spark 2.1.0.

3 files changed, +13 −0 lines changed

docs/building-spark.md

Lines changed: 6 additions & 0 deletions
@@ -13,6 +13,7 @@ redirect_from: "building-with-maven.html"
 
 The Maven-based build is the build of reference for Apache Spark.
 Building Spark using Maven requires Maven 3.3.9 or newer and Java 7+.
+Note that support for Java 7 is deprecated as of Spark 2.1.0 and may be removed in Spark 2.2.0.
 
 ### Setting up Maven's Memory Usage
 
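For orientation while reading this hunk (not part of the commit): the "Setting up Maven's Memory Usage" section it touches amounts to exporting MAVEN_OPTS before invoking the build. A minimal sketch, with illustrative memory values:

    # illustrative settings; pick values that suit your machine
    export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
    ./build/mvn -DskipTests clean package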
@@ -79,6 +80,9 @@ Because HDFS is not protocol-compatible across versions, if you want to read fro
 </tbody>
 </table>
 
+Note that support for versions of Hadoop before 2.6 are deprecated as of Spark 2.1.0 and may be
+removed in Spark 2.2.0.
+
 
 You can enable the `yarn` profile and optionally set the `yarn.version` property if it is different from `hadoop.version`. Spark only supports YARN versions 2.2.0 and later.
 
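As an aside for readers building against their own cluster (illustrative, not part of this diff), enabling the `yarn` profile and pinning the Hadoop/YARN versions looks roughly like:

    # hypothetical versions; match them to your cluster
    ./build/mvn -Pyarn -Dhadoop.version=2.7.3 -Dyarn.version=2.7.3 -DskipTests clean package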
@@ -129,6 +133,8 @@ To produce a Spark package compiled with Scala 2.10, use the `-Dscala-2.10` prop
 
 ./dev/change-scala-version.sh 2.10
 ./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
+
+Note that support for Scala 2.10 is deprecated as of Spark 2.1.0 and may be removed in Spark 2.2.0.
 
 ## Building submodules individually
 
docs/index.md

Lines changed: 3 additions & 0 deletions
@@ -28,6 +28,9 @@ Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark {{s
 uses Scala {{site.SCALA_BINARY_VERSION}}. You will need to use a compatible Scala version
 ({{site.SCALA_BINARY_VERSION}}.x).
 
+Note that support for Java 7, Python 2.6, Scala 2.10 and version of Hadoop before 2.6 are
+deprecated as of Spark 2.1.0, and may be removed in Spark 2.2.0.
+
 # Running the Examples and Shell
 
 Spark comes with several sample programs. Scala, Java, Python and R examples are in the
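The "Running the Examples and Shell" context that closes this hunk is truncated above; as a hedged aside (not part of the commit), the bundled sample programs it refers to are normally launched through the run-example script, e.g.:

    # SparkPi is one of the bundled examples; the argument is illustrative
    ./bin/run-example SparkPi 10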

docs/programming-guide.md

Lines changed: 4 additions & 0 deletions
@@ -59,6 +59,8 @@ Spark {{site.SPARK_VERSION}} works with Java 7 and higher. If you are using Java
 for concisely writing functions, otherwise you can use the classes in the
 [org.apache.spark.api.java.function](api/java/index.html?org/apache/spark/api/java/function/package-summary.html) package.
 
+Note that support for Java 7 is deprecated as of Spark 2.1.0 and may be removed in Spark 2.2.0.
+
 To write a Spark application in Java, you need to add a dependency on Spark. Spark is available through Maven Central at:
 
 groupId = org.apache.spark
@@ -87,6 +89,8 @@ import org.apache.spark.SparkConf
 Spark {{site.SPARK_VERSION}} works with Python 2.6+ or Python 3.4+. It can use the standard CPython interpreter,
 so C libraries like NumPy can be used. It also works with PyPy 2.3+.
 
+Note that support for Python 2.6 is deprecated as of Spark 2.1.0, and may be removed in Spark 2.2.0.
+
 To run Spark applications in Python, use the `bin/spark-submit` script located in the Spark directory.
 This script will load Spark's Java/Scala libraries and allow you to submit applications to a cluster.
 You can also use `bin/pyspark` to launch an interactive Python shell.
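As a brief, hedged illustration of the workflow this hunk documents (not part of the diff), a Python application is typically submitted and the interactive shell launched like so:

    # pi.py is one of the bundled Python examples; the master URL is illustrative
    ./bin/spark-submit --master local[4] examples/src/main/python/pi.py 10

    # interactive Python shell
    ./bin/pyspark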
