
Conversation

@nchammas
Contributor

@nchammas nchammas commented Dec 2, 2015

I haven't created a JIRA. If we absolutely need one I'll do it, but I'm fine with not getting mentioned in the release notes if that's the only purpose it'll serve.

cc @marmbrus - We should include this in 1.6-RC2 if there is one. I can open a second PR against branch-1.6 if necessary.

ec2/spark_ec2.py Outdated
Contributor Author

I'm not sure about this mapping. I'm just assuming that a version of Tachyon that works with 1.5.0 will also work with later versions. @haoyuan?

Contributor

FYI, we're now depending on Tachyon 0.8.2.

Contributor Author

Starting with Spark 1.6.0?

Contributor

Yes.

Contributor Author

Done.
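
For readers following the thread: a minimal sketch of the kind of Spark-to-Tachyon version mapping being discussed. The dict name, the helper, and the pre-1.6 pairings are assumptions for illustration; the only pairing taken from the thread is Spark 1.6.0 with Tachyon 0.8.2.

```python
# Illustrative sketch only: the mapping name, the helper, and the older entries
# are assumptions. Per the thread above, Spark 1.6.0 pairs with Tachyon 0.8.2.
SPARK_TACHYON_MAP = {
    "1.5.0": "0.7.1",  # assumed pairing, for illustration
    "1.5.1": "0.7.1",  # assumed pairing, for illustration
    "1.5.2": "0.7.1",  # assumed pairing, for illustration
    "1.6.0": "0.8.2",  # stated in this review thread
}


def get_tachyon_version(spark_version):
    """Look up the Tachyon version paired with a given Spark version ("" if unknown)."""
    return SPARK_TACHYON_MAP.get(spark_version, "")
```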

@marmbrus
Contributor

marmbrus commented Dec 2, 2015

JIRA please. This is how we track things to make sure they get merged before RCs are cut.

@nchammas
Contributor Author

nchammas commented Dec 2, 2015

👌

@nchammas nchammas changed the title [EC2] Update spark-ec2 versions [SPARK-12107] [EC2] Update spark-ec2 versions Dec 2, 2015
@nchammas
Contributor Author

nchammas commented Dec 2, 2015

Done.

@SparkQA

SparkQA commented Dec 2, 2015

Test build #47087 has finished for PR 10109 at commit b67767f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 3, 2015

Test build #47101 has finished for PR 10109 at commit 8da0bfe.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@shivaram
Contributor

shivaram commented Dec 3, 2015

LGTM. Merging this. Thanks @nchammas

@shivaram
Contributor

shivaram commented Dec 3, 2015

Well, there is one issue here. The launch will fail for users on master and branch-1.6 until the 1.6.0 tarballs are available on S3. I am not sure there is a nice way around this.
One way around it is to merge this only on branch-1.6, keep the PR open, and merge it on master after 1.6 is released.

Thoughts?

@nchammas
Contributor Author

nchammas commented Dec 3, 2015

What have we done in the past in this situation?

People running on master can get around this by manually specifying the Spark version with -v until 1.6.0 is published. Is that acceptable?
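
As a concrete illustration of that workaround, here is a sketch of how the -v flag relates to the default version. The optparse wiring and help text are assumed; the -v/--spark-version flag and DEFAULT_SPARK_VERSION are the names used elsewhere in this conversation.

```python
from optparse import OptionParser

# Assumed sketch of the option wiring; not the actual spark_ec2.py source.
DEFAULT_SPARK_VERSION = "1.6.0"  # the default this PR moves to

parser = OptionParser()
parser.add_option(
    "-v", "--spark-version", default=DEFAULT_SPARK_VERSION,
    help="Version of Spark to deploy (default: %default)")

# Until the 1.6.0 packages are published, someone on master could pin an
# already-released version explicitly, e.g. (key and cluster names hypothetical):
#   ./spark-ec2 -k my-keypair -i my-keypair.pem -v 1.5.2 launch my-cluster
(opts, args) = parser.parse_args(["-v", "1.5.2", "launch", "my-cluster"])
print(opts.spark_version)  # -> 1.5.2
```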

@shivaram
Contributor

shivaram commented Dec 3, 2015

I think in the past our approach has been:

  1. Don't update the DEFAULT_SPARK_VERSION on master until the release is up.
  2. When tagging the release, @pwendell used to check in a PR to update the DEFAULT_SPARK_VERSION on the release branch (kind of similar to how the Maven POM versions are set; see the sketch below).
  3. Also backport the PR from step 1 to branch-1.x after the release.

This is obviously very cumbersome (especially step 2?), and I'm not sure whether @pwendell has any scripts that automate the release step in particular.
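
Concretely, steps 1 and 2 come down to bumping a couple of version constants in ec2/spark_ec2.py on the appropriate branch. A rough sketch, where only DEFAULT_SPARK_VERSION is a name taken from this conversation and the rest is assumed for illustration:

```python
# Rough sketch of the constants a release-time version bump would touch.
# DEFAULT_SPARK_VERSION is the name mentioned above; the other names are assumed.
SPARK_EC2_VERSION = "1.6.0"
DEFAULT_SPARK_VERSION = SPARK_EC2_VERSION

VALID_SPARK_VERSIONS = set([
    # ... earlier releases elided ...
    "1.5.1",
    "1.5.2",
    "1.6.0",  # add once the 1.6.0 release is actually up
])
```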

@nchammas
Contributor Author

nchammas commented Dec 3, 2015

If it helps simplify things, why don't we immediately merge this change into both master and branch-1.6, and simply ask people to use -v to specify a different version of Spark until 1.6 is actually published?

I think it's reasonable to expect people running on the bleeding edge to deal with really minor annoyances like this if it means making life easier for the maintainers.

@shivaram
Contributor

shivaram commented Dec 3, 2015

Alright, I'm fine with merging this since (a) we are pretty close to the release and (b) we should try to pull spark_ec2.py out of the Apache repo to avoid this for the 2.0 release.

asfgit pushed a commit that referenced this pull request Dec 3, 2015
I haven't created a JIRA. If we absolutely need one I'll do it, but I'm fine with not getting mentioned in the release notes if that's the only purpose it'll serve.

cc marmbrus - We should include this in 1.6-RC2 if there is one. I can open a second PR against branch-1.6 if necessary.

Author: Nicholas Chammas <[email protected]>

Closes #10109 from nchammas/spark-ec2-versions.

(cherry picked from commit ad7cea6)
Signed-off-by: Shivaram Venkataraman <[email protected]>
@asfgit asfgit closed this in ad7cea6 Dec 3, 2015
@nchammas
Contributor Author

nchammas commented Dec 3, 2015

👍

@nchammas nchammas deleted the spark-ec2-versions branch December 4, 2015 03:20