[SPARK-12629] [SparkR] Fixes for DataFrame saveAsTable method #10580
Conversation
Test build #48692 has finished for PR 10580 at commit
Test build #48697 has finished for PR 10580 at commit
I think the direction we are heading is to have multiple sqlContext sessions and be able to specify which one to use, instead of automatically picking one?
I think the same problem exists in write.df()? Could you extract the following logic into a util function, and update write.df() to use it as well?
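A shared helper along these lines could be reused by both write.df() and saveAsTable(). This is a minimal sketch: the cached-context variable names `.sparkRHivesc` and `.sparkRSQLsc` are assumptions based on how SparkR 1.x stores its contexts, and the function name mirrors the `getSqlContext()` proposed later in this thread.

```r
# Hypothetical util: look up whichever SQL context has been initialized,
# preferring the Hive context, so callers don't duplicate this lookup.
getSqlContext <- function(env = .GlobalEnv) {
  if (exists(".sparkRHivesc", envir = env)) {
    get(".sparkRHivesc", envir = env)
  } else if (exists(".sparkRSQLsc", envir = env)) {
    get(".sparkRSQLsc", envir = env)
  } else {
    stop("SQL context not initialized; call sparkRSQL.init() first")
  }
}
```

Both write.df() and saveAsTable() could then call this helper instead of repeating the environment lookup inline.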
See #9192 :)
@felixcheung, @sun-rui, @shivaram: that's a good idea. I can use the "getSqlContext()" method proposed in https://github.com/apache/spark/pull/9192/files#diff-b11442485f6b77bf47b58b4747321638R36
I guess I need to wait for that pull request to go in, and then I can merge it with mine.
@felixcheung, it seems that you have it for saveAsTable too. After the merge I can add my test cases.
@felixcheung, as we discussed before, I don't think it makes sense to support multiple sessions in SparkR now. Let's keep the current approach until there is a need to support it.
R/pkg/R/DataFrame.R (Outdated)
Please use df.write.saveAsTable() instead of deprecated df.saveAsTable().
This is done in #10584.
Yes, let's merge with PR #10584 to unblock SparkR
I think we may need more discussion on #9192, so let's merge this first.
Test build #48951 has finished for PR 10580 at commit
Test build #48952 has finished for PR 10580 at commit
I see. I've pushed all my changes and left the piece of code which retrieves the context in case source is not passed as an argument. Later, you can replace it with the getContext utility method.
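The fallback being described might look roughly like this. It is a sketch under assumptions: `spark.sql.sources.default` is the real Spark SQL conf key for the default data source, but the helper name `resolveSource` and the plain-list conf representation are illustrative only.

```r
# Hypothetical helper: resolve the data source, consulting the SQL conf
# only when the caller did not pass `source` explicitly.
resolveSource <- function(source, conf = list()) {
  if (!is.null(source)) {
    return(source)  # caller's explicit choice wins
  }
  default <- conf[["spark.sql.sources.default"]]
  if (is.null(default)) "org.apache.spark.sql.parquet" else default
}
```

In the actual PR the conf lookup goes through the JVM-side SQLContext, which is why the context retrieval only happens when `source` is missing.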
looks fine to me
@NarineK, could you update the write.df() function as well?
Updated write.df() and saveDF().
Test build #49840 has finished for PR 10580 at commit
Test build #49884 has finished for PR 10580 at commit
Added! Let me know what you think!
LGTM
Merging this to master.
@sun-rui @felixcheung I'm going to backport this to branch-1.6 so that the bug fix is available in 1.6.1. Let me know if you think there is a problem with it.
I've tried to solve some of the issues mentioned in: https://issues.apache.org/jira/browse/SPARK-12629
Please let me know what you think. Thanks!

Author: Narine Kokhlikyan <[email protected]>

Closes #10580 from NarineK/sparkrSavaAsRable.

(cherry picked from commit 8a88e12)
Signed-off-by: Shivaram Venkataraman <[email protected]>