
Conversation

@briuri briuri commented Jun 28, 2016

What changes were proposed in this pull request?

  • Adds 1.6.2 and 1.6.3 as supported Spark versions within the bundled spark-ec2 script.
  • Makes the default Spark version 1.6.3 to keep in sync with the upcoming release.
  • Does not touch the newer spark-ec2 scripts in the separate amplab repository.
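The change itself is small version bookkeeping. A minimal Python sketch of the kind of edit involved, assuming the bundled spark_ec2.py tracks supported releases in a set — the names `VALID_SPARK_VERSIONS`, `DEFAULT_SPARK_VERSION`, and `validate_spark_version` below are illustrative assumptions, not the script's actual code:

```python
# Illustrative sketch only -- names are assumptions, not the real spark_ec2.py.
VALID_SPARK_VERSIONS = {
    "1.6.0",
    "1.6.1",
    "1.6.2",  # added by this PR
    "1.6.3",  # added by this PR
}

DEFAULT_SPARK_VERSION = "1.6.3"  # bumped to track the upcoming release


def validate_spark_version(version=DEFAULT_SPARK_VERSION):
    """Fail fast on an unsupported --spark-version before touching AWS."""
    if version not in VALID_SPARK_VERSIONS:
        raise ValueError("Unsupported Spark version: %s" % version)
    return version
```

Keeping the default in lockstep with the branch's upcoming release means a bare `--spark-version`-less invocation launches the newest supported cluster.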

How was this patch tested?

  • Manual script execution:

export AWS_SECRET_ACCESS_KEY=snip
export AWS_ACCESS_KEY_ID=snip
$SPARK_HOME/ec2/spark-ec2 \
    --key-pair=snip \
    --identity-file=snip \
    --region=us-east-1 \
    --vpc-id=snip \
    --slaves=1 \
    --instance-type=t1.micro \
    --spark-version=1.6.2 \
    --hadoop-major-version=yarn \
    launch test-cluster

  • Result: Successful creation of a 1.6.2-based Spark cluster.
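For context, `--hadoop-major-version` selects a Hadoop build flavor rather than an exact release number. A hedged sketch of how such a flag might be validated and mapped to a build label — the `HADOOP_FLAVORS` table and function name below are assumptions for illustration, not the script's real code:

```python
# Illustrative only: this mapping is assumed for the sketch, not taken
# verbatim from spark_ec2.py.
HADOOP_FLAVORS = {
    "1": "hadoop1",       # classic Hadoop 1.x builds
    "2": "cdh4",          # CDH4-flavored Hadoop 2 builds
    "yarn": "hadoop2.4",  # YARN-enabled builds, as in the test run above
}


def resolve_hadoop_flavor(major_version):
    """Translate a --hadoop-major-version value into a build label."""
    if major_version not in HADOOP_FLAVORS:
        raise ValueError("Unknown --hadoop-major-version: %s" % major_version)
    return HADOOP_FLAVORS[major_version]
```

Validating the flag up front keeps a typo from surfacing only after EC2 instances have already been provisioned.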

This contribution is my original work and I license the work to the project under the project's open source license.

@srowen srowen (Member) commented Jun 28, 2016

LGTM

@srowen srowen (Member) commented Jun 28, 2016

Jenkins test this please

@SparkQA SparkQA commented Jun 28, 2016

Test build #61393 has finished for PR 13947 at commit 6b86b69.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

asfgit pushed a commit that referenced this pull request Jun 30, 2016
….6.3.

## What changes were proposed in this pull request?

- Adds 1.6.2 and 1.6.3 as supported Spark versions within the bundled spark-ec2 script.
- Makes the default Spark version 1.6.3 to keep in sync with the upcoming release.
- Does not touch the newer spark-ec2 scripts in the separate amplab repository.

## How was this patch tested?

- Manual script execution:

export AWS_SECRET_ACCESS_KEY=_snip_
export AWS_ACCESS_KEY_ID=_snip_
$SPARK_HOME/ec2/spark-ec2 \
    --key-pair=_snip_ \
    --identity-file=_snip_ \
    --region=us-east-1 \
    --vpc-id=_snip_ \
    --slaves=1 \
    --instance-type=t1.micro \
    --spark-version=1.6.2 \
    --hadoop-major-version=yarn \
    launch test-cluster

- Result: Successful creation of a 1.6.2-based Spark cluster.

This contribution is my original work and I license the work to the project under the project's open source license.

Author: Brian Uri <[email protected]>

Closes #13947 from briuri/branch-1.6-bug-spark-16257.
@srowen srowen (Member) commented Jun 30, 2016

Merged to 1.6

@srowen srowen (Member) commented Jun 30, 2016

@briuri thank you. The way the ASF git bot works, it can't auto-close this PR. Can you close it? It's merged now.

@briuri briuri closed this Jun 30, 2016
zzcclp pushed a commit to zzcclp/spark that referenced this pull request Jul 1, 2016
….6.3.

## What changes were proposed in this pull request?

- Adds 1.6.2 and 1.6.3 as supported Spark versions within the bundled spark-ec2 script.
- Makes the default Spark version 1.6.3 to keep in sync with the upcoming release.
- Does not touch the newer spark-ec2 scripts in the separate amplab repository.

## How was this patch tested?

- Manual script execution:

export AWS_SECRET_ACCESS_KEY=_snip_
export AWS_ACCESS_KEY_ID=_snip_
$SPARK_HOME/ec2/spark-ec2 \
    --key-pair=_snip_ \
    --identity-file=_snip_ \
    --region=us-east-1 \
    --vpc-id=_snip_ \
    --slaves=1 \
    --instance-type=t1.micro \
    --spark-version=1.6.2 \
    --hadoop-major-version=yarn \
    launch test-cluster

- Result: Successful creation of a 1.6.2-based Spark cluster.

This contribution is my original work and I license the work to the project under the project's open source license.

Author: Brian Uri <[email protected]>

Closes apache#13947 from briuri/branch-1.6-bug-spark-16257.

(cherry picked from commit ccc7fa3)