Conversation

@JoshRosen
Contributor

This addresses a PySpark issue where a failed attempt to construct SparkContext would prevent any future SparkContext creation.
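The pattern behind the fix can be sketched in plain Python. This is a hypothetical toy model, not the actual PySpark internals: the class name `Context`, the `_do_init` helper, and the `fail` flag are all illustrative stand-ins. The idea is that if initialization fails after the context has been recorded as active, the partial state is cleaned up so a later attempt can still succeed.

```python
# Hypothetical sketch of the cleanup-on-failure pattern this patch adopts;
# class and method names are illustrative, not the real PySpark internals.

class Context:
    _active = None  # stands in for SparkContext._active_spark_context

    def __init__(self, fail=False):
        if Context._active is not None:
            raise ValueError("Cannot run multiple contexts at once")
        Context._active = self  # recorded before initialization completes
        try:
            self._do_init(fail)
        except Exception:
            # The key idea: clean up after a failed attempt so that future
            # context creation is not blocked by stale state.
            self.stop()
            raise

    def _do_init(self, fail):
        if fail:
            raise RuntimeError("simulated initialization failure")

    def stop(self):
        # Only clear the slot if we are the context it points to.
        if Context._active is self:
            Context._active = None


# A failed attempt no longer prevents a successful retry:
try:
    Context(fail=True)
except RuntimeError:
    pass
sc = Context()  # succeeds even after the earlier failure
sc.stop()
```

Before the patch, the equivalent of the `except` branch was missing, so the stale `_active` reference made every subsequent creation attempt raise.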

@SparkQA

SparkQA commented Jul 27, 2014

QA tests have started for PR 1606. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17232/consoleFull

@JoshRosen
Contributor Author

Because we keep a reference to this object in SparkContext._active_spark_context, this method never gets called except when cleaning up after a SparkContext creation attempt that failed because another context was already running. In that case, the call to sc.stop() clears SparkContext._active_spark_context and we lose track of the active context, which can allow the creation of multiple running contexts.
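The hazard described here can be modeled with a small Python sketch. These names are assumptions for illustration only (`Ctx`, `stop_unguarded`, `stop_guarded`), not the real PySpark code: if `stop()` unconditionally clears the class-level slot, then cleaning up a failed second creation attempt also forgets the first, still-running context; guarding the clear with an identity check avoids that.

```python
# Toy model of the tracking hazard described above; names are illustrative,
# not the actual PySpark implementation.

class Ctx:
    _active = None  # stands in for SparkContext._active_spark_context

    def __init__(self):
        if Ctx._active is not None:
            raise ValueError("Cannot run multiple contexts at once")
        Ctx._active = self

    def stop_unguarded(self):
        # Clears the slot even when called while cleaning up an attempt that
        # never became the active context -- the first context is forgotten,
        # and a further context could then be created alongside it.
        Ctx._active = None

    def stop_guarded(self):
        # Only clear the slot if we are the active context.
        if Ctx._active is self:
            Ctx._active = None


first = Ctx()
half_built = object.__new__(Ctx)  # a failed attempt: __init__ raised early

half_built.stop_guarded()
assert Ctx._active is first  # the running context is still tracked

half_built.stop_unguarded()
assert Ctx._active is None   # pre-fix behavior: we lost track of `first`
```

The guarded variant is what makes it safe to call `stop()` during cleanup of a failed creation attempt.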

@SparkQA

SparkQA commented Jul 27, 2014

QA results for PR 1606:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17232/consoleFull

@mattf

mattf commented Jul 27, 2014

+1 lgtm

i've been hitting this issue repeatedly, but assumed it was a corner case that wouldn't get much attention. my assumption is that spark-submit and pyspark are the primary ways to get a context.

@mateiz
Contributor

mateiz commented Jul 28, 2014

Thanks Josh, merged this.

@asfgit asfgit closed this in a7d145e Jul 28, 2014
xiliu82 pushed a commit to xiliu82/spark that referenced this pull request Sep 4, 2014
This addresses a PySpark issue where a failed attempt to construct SparkContext would prevent any future SparkContext creation.

Author: Josh Rosen <[email protected]>

Closes apache#1606 from JoshRosen/SPARK-1550 and squashes the following commits:

ec7fadc [Josh Rosen] [SPARK-1550] [PySpark] Allow SparkContext creation after failed attempts
sunchao pushed a commit to sunchao/spark that referenced this pull request Jun 2, 2023
(cherry picked from commit d6a02ec59cd1997858a2e63e85e834361a769dc3)
Signed-off-by: Dongjoon Hyun <[email protected]>
