[SPARK-11780][SQL] Add type aliases backwards compatibility #10635
Conversation
Test build #48914 has finished for PR 10635 at commit
Test build #48925 has finished for PR 10635 at commit
retest this please
Test build #48988 has finished for PR 10635 at commit
Does this actually let you use one source to compile against both versions of Spark?
I could be missing something, but since you had to manually exclude things here to make it compile, it seems like users are going to run into problems. Can you actually write a program that uses these types and check whether it works with both Spark 1.5 and Spark 1.6?
@marmbrus Okay, I'll try it within a day.
@marmbrus Checked; this PR resolves the issue, that is, the same source compiles against both versions.
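For context, here is a minimal sketch of the kind of user program discussed above, written only against the pre-1.6 package location. The specific classes (ArrayData, GenericArrayData) and the UDT-style usage are assumptions about which of the moved classes users typically reference; the thread does not name them.

```scala
// Sketch of user code written against Spark 1.5, where ArrayData and
// GenericArrayData are assumed to live in org.apache.spark.sql.types.
// With the aliases added by this PR, the same import should also resolve
// on 1.6.x, so one source compiles against both versions.
import org.apache.spark.sql.types.{ArrayData, GenericArrayData}

object AliasCompatCheck {
  def main(args: Array[String]): Unit = {
    // Build an ArrayData value the way a custom UDT's serialize() might.
    val data: ArrayData = new GenericArrayData(Array[Any](1, 2, 3))
    println(s"numElements = ${data.numElements()}")
  }
}
```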
I found that aggregation functions such as
Test build #49355 has finished for PR 10635 at commit
I'm confused. The repo you link to compiles fine against Spark 1.6.0.
Sorry for the confusion.
Added type aliases in org.apache.spark.sql.types for classes moved to org.apache.spark.sql.catalyst.util.
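For reference, a minimal sketch of what such backwards-compatibility aliases can look like in a Scala package object; the class names and the deprecation version string are illustrative assumptions, not taken from the actual diff.

```scala
// Illustrative only: re-export moved classes under their old package so that
// sources written against the pre-1.6 location keep compiling.
package org.apache.spark.sql

package object types {
  @deprecated("Moved to org.apache.spark.sql.catalyst.util.ArrayData", "1.6.1")
  type ArrayData = org.apache.spark.sql.catalyst.util.ArrayData

  @deprecated("Moved to org.apache.spark.sql.catalyst.util.GenericArrayData", "1.6.1")
  type GenericArrayData = org.apache.spark.sql.catalyst.util.GenericArrayData
}
```

Note that a type alias covers the type name and `new`, but not a companion object at the old location; code that called factory methods on an old companion would additionally need a `val` forwarder.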
@marmbrus ping
Test build #49876 has finished for PR 10635 at commit
Okay, this seems good, but can you reopen it against branch-1.6? I don't think that we want to continue to offer compatibility once the major version changes in 2.0.
@marmbrus Okay, I'll do that.
Re-targeted at branch-1.6 from #10635. Author: Takeshi YAMAMURO <[email protected]> Closes #10915 from maropu/pr9935-v3.
Rework from #9935 because it's stale.