[SPARK-15116] In REPL we should create SparkSession first and get SparkContext from it #12890
Conversation
cc @andrewor14 @rxin
Are these not called only once?
Test build #57732 has finished for PR 12890 at commit
Yeah, but look at the logic of it. In REPL, we create the
retest this please
Sorry, I still don't understand what's going on from your comment. One thing to note: we are going to remove `withHiveSupport`, so adding it back isn't a "fix" per se.
It seems like there is an underlying problem here. Can we fix that?
Test build #57733 has finished for PR 12890 at commit
Test build #57742 has finished for PR 12890 at commit
Is the problem that `val sparkContext = SparkContext.getOrCreate(sparkConf)` will give us a `sparkContext` that was already created by the REPL, and its conf does not
```diff
  val builder = SparkSession.builder.config(conf)
  if (SparkSession.hiveClassesArePresent) {
-   sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
+   sparkSession = builder.enableHiveSupport().getOrCreate()
```
I guess we want to use `builder.config("spark.sql.catalogImplementation", "hive")`?
That's what `enableHiveSupport` does?
oh, right. We still have this method.
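To make the exchange above concrete: in Spark 2.x, `enableHiveSupport()` is shorthand for setting `spark.sql.catalogImplementation` to `hive` on the builder. The tiny `Builder` class below is a hypothetical stand-in (not Spark's actual class) that just models that sugar:

```scala
// Hypothetical stand-in for SparkSession.Builder (NOT Spark's real class),
// modeling how enableHiveSupport() is sugar for one config entry.
class Builder {
  var settings: Map[String, String] = Map.empty
  def config(key: String, value: String): Builder = { settings += (key -> value); this }
  // Mirrors what SparkSession.Builder.enableHiveSupport() sets under the hood.
  def enableHive(): Builder = config("spark.sql.catalogImplementation", "hive")
}

val viaSugar  = new Builder().enableHive()
val viaConfig = new Builder().config("spark.sql.catalogImplementation", "hive")
assert(viaSugar.settings == viaConfig.settings) // identical effect
```

So either spelling produces the same builder state; the distinction in the review comment is purely cosmetic.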
LGTM
By the way, I think the comment #12873 (comment) captures the point but is a little misleading. The issue here is that if there's already a
It seems like we might need a bit more design work here to figure out what the right behavior should be. Let's talk more offline.
This patch itself has little to do with the design, so let's just merge it first.
Author: Wenchen Fan <[email protected]> Closes #12890 from cloud-fan/repl.
What changes were proposed in this pull request?

See #12873 (comment). The problem is, if we create `SparkContext` first and then call `SparkSession.builder.enableHiveSupport().getOrCreate()`, we will reuse the existing `SparkContext` and the hive flag won't be set.

How was this patch tested?

Verified it locally.
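The pitfall the description refers to can be modeled without Spark itself. The mini `Context`/`SessionBuilder` below is hypothetical (not Spark's real API) and only sketches why a `getOrCreate` that finds a pre-existing context silently ignores newly requested settings such as the hive flag:

```scala
// Hypothetical mini-model of the getOrCreate pitfall (NOT Spark's real API).
// A Context is created once with a fixed config map; later getOrCreate calls
// return the existing instance, ignoring any newly requested settings.
object Context {
  @volatile private var active: Option[Context] = None
  def getOrCreate(conf: Map[String, String]): Context = synchronized {
    active.getOrElse {
      val c = new Context(conf)
      active = Some(c)
      c
    }
  }
  def reset(): Unit = synchronized { active = None }
}
class Context(val conf: Map[String, String])

class SessionBuilder {
  private var settings = Map.empty[String, String]
  def config(k: String, v: String): SessionBuilder = { settings += (k -> v); this }
  // Models enableHiveSupport(): just a config flag under the hood.
  def enableHive(): SessionBuilder = config("catalogImplementation", "hive")
  def getOrCreate(): Context = Context.getOrCreate(settings)
}

// REPL-style bug: the context is created first, without the hive flag...
val pre = Context.getOrCreate(Map.empty)
// ...so a later builder that asks for hive support silently reuses it.
val reused = (new SessionBuilder).enableHive().getOrCreate()
assert(reused eq pre)
assert(!reused.conf.contains("catalogImplementation")) // hive flag lost

// The ordering this PR adopts: build the session (and thus its context)
// first, so the flag is present when the context is constructed.
Context.reset()
val fixed = (new SessionBuilder).enableHive().getOrCreate()
assert(fixed.conf("catalogImplementation") == "hive")
```

This is why the patch creates the `SparkSession` first and derives the `SparkContext` from it, rather than the other way around.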