[SPARK-16257][BUILD] Update spark_ec2.py to support Spark 1.6.2 and 1.6.3.
## What changes were proposed in this pull request?
- Adds 1.6.2 and 1.6.3 as supported Spark versions within the bundled spark-ec2 script.
- Makes 1.6.3 the default Spark version, keeping it in sync with the upcoming release.
- Does not touch the newer spark-ec2 scripts in the separate amplab repository.
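The change described above amounts to roughly the following sketch, assuming the bundled `spark_ec2.py` tracks supported releases in a module-level set and derives the default from a version constant (the names `VALID_SPARK_VERSIONS`, `SPARK_EC2_VERSION`, and `DEFAULT_SPARK_VERSION` here are illustrative; the exact identifiers in the script may differ):

```python
# Illustrative sketch of the version bump in the bundled spark_ec2.py;
# variable names are assumptions, not a verbatim diff.
SPARK_EC2_VERSION = "1.6.3"  # bumped to match the upcoming release

VALID_SPARK_VERSIONS = set([
    "1.6.0",
    "1.6.1",
    "1.6.2",  # added by this patch
    "1.6.3",  # added by this patch
])

DEFAULT_SPARK_VERSION = SPARK_EC2_VERSION
```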
## How was this patch tested?
- Manual script execution:

```shell
export AWS_SECRET_ACCESS_KEY=_snip_
export AWS_ACCESS_KEY_ID=_snip_
$SPARK_HOME/ec2/spark-ec2 \
  --key-pair=_snip_ \
  --identity-file=_snip_ \
  --region=us-east-1 \
  --vpc-id=_snip_ \
  --slaves=1 \
  --instance-type=t1.micro \
  --spark-version=1.6.2 \
  --hadoop-major-version=yarn \
  launch test-cluster
```

- Result: Successful creation of a 1.6.2-based Spark cluster.
This contribution is my original work and I license the work to the project under the project's open source license.
Author: Brian Uri <[email protected]>
Closes apache#13947 from briuri/branch-1.6-bug-spark-16257.
(cherry picked from commit ccc7fa3)