[SPARK-18138][DOCS] Document that Java 7, Python 2.6, Scala 2.10, Hadoop < 2.6 are deprecated in Spark 2.1.0 #15733
Conversation
Test build #67980 has finished for PR 15733 at commit
nchammas left a comment
Java 7 and Python 2.6 support have both been deprecated since 2.0. Dunno if you wanna update the text to be more precise in this regard.
docs/building-spark.md
Outdated
The Maven-based build is the build of reference for Apache Spark.
Building Spark using Maven requires Maven 3.3.9 or newer and Java 7+.
Note that support for Java 7 is deprecated as of Spark 2.1.0 and may be removed in Spark 2.2.0.
I believe it's been deprecated since 2.0.
Good point, will adjust this.
Test build #67988 has finished for PR 15733 at commit
docs/index.md
Outdated
Note that support for Java 7, Python 2.6, Scala 2.10 and version of Hadoop before 2.6 are
deprecated as of Spark 2.1.0, and may be removed in Spark 2.2.0.
Note that support for Java 7 and Python 2.6 are deprecated as of Spark 2.0.0, and support for
Scala 2.10 and version of Hadoop before 2.6 are deprecated as of Spark 2.1.0, and may be
"... and versions of Hadoop..."
Test build #67992 has finished for PR 15733 at commit
Can you update SparkContext to print a deprecation warning when these environments are used?
I added some warnings -- examples -- I found an API for Hadoop version info but it's labeled "private" and "unstable", so wasn't sure whether it's worth it to access it just to warn.
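For context, here is a minimal sketch of what such startup checks could look like. This is not the actual diff from this PR: the object name, the warnDeprecatedEnvironments method, and the stderr-based logWarning helper are illustrative stand-ins (Spark itself would use its internal Logging trait); only the standard java.specification.version system property and scala.util.Properties are assumed.

```scala
// A minimal sketch, NOT the change merged in this PR: roughly how
// version-based deprecation warnings could be emitted when a SparkContext
// starts up. logWarning is a hypothetical stand-in for Spark's Logging trait.
object DeprecationWarnings {

  private def logWarning(msg: String): Unit = System.err.println(s"WARN: $msg")

  def warnDeprecatedEnvironments(): Unit = {
    // java.specification.version is "1.7" on Java 7, "1.8" on Java 8, etc.
    if (System.getProperty("java.specification.version", "") == "1.7") {
      logWarning("Support for Java 7 is deprecated as of Spark 2.0.0 " +
        "and may be removed in Spark 2.2.0.")
    }

    // scala.util.Properties.versionNumberString is e.g. "2.10.6" or "2.11.8".
    if (scala.util.Properties.versionNumberString.startsWith("2.10")) {
      logWarning("Support for Scala 2.10 is deprecated as of Spark 2.1.0 " +
        "and may be removed in Spark 2.2.0.")
    }

    // No Hadoop check here: as noted in the comment above, the Hadoop version
    // API is marked private/unstable, so it may not be worth calling just to
    // warn about Hadoop < 2.6.
  }
}
```

Because a PySpark application still starts a JVM-side SparkContext, a JVM-level check along these lines would also surface for Python users, which is relevant to the question below.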
Test build #68012 has finished for PR 15733 at commit
nchammas left a comment
If a PySpark user is using Python 2.7+ and Java 7, will they see the warning about Java 7?
I think the answer is yes, but I just want to make sure.
LGTM pending tests.
Test build #68013 has finished for PR 15733 at commit
Test build #3408 has finished for PR 15733 at commit
Merging in master/branch-2.1.
Author: Sean Owen <[email protected]>. Closes #15733 from srowen/SPARK-18138. Merged to master as commit dc4c600 and cherry-picked into branch-2.1, signed off by Reynold Xin <[email protected]>.
What changes were proposed in this pull request?
Document that Java 7, Python 2.6, Scala 2.10, Hadoop < 2.6 are deprecated in Spark 2.1.0. This does not actually implement any of the changes in SPARK-18138, just peppers the documentation with notices about it.
How was this patch tested?
Doc build