
Conversation

@scwf
Contributor

@scwf scwf commented Sep 23, 2014

Currently Spark SQL builds against Hive 0.12.0 and does not support 0.13.1 because of some API-level changes in the newer Hive version.
Since Hive is backwards compatible, this PR simply upgrades the Hive version to 0.13.1 (compiling this PR against 0.12.0 will fail). I think this is fine for users, and we then do not need to support multiple Hive versions.

Notes:

  1. The package command is unchanged; sbt/sbt -Phive assembly will produce the assembly jar with Hive 0.13.1.
  2. This PR uses org.apache.hive, since there is no shaded org.spark-project.hive artifact for 0.13.1.
  3. I regenerated the golden answers because some SQL query results changed.

@SparkQA

SparkQA commented Sep 23, 2014

Can one of the admins verify this patch?

@SparkQA

SparkQA commented Sep 23, 2014

QA tests have started for PR 2499 at commit 6d5d071.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Sep 23, 2014

QA tests have finished for PR 2499 at commit 6d5d071.

  • This patch fails unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@scwf
Contributor Author

scwf commented Sep 29, 2014

@marmbrus, can you test this?

@marmbrus
Contributor

Hi, thanks for working on this. Unfortunately, we can't just upgrade to Hive 0.13.0, as that would break users who are running 0.12.0. Instead, there is already a PR open, #2241, that supports both versions at the same time using a shim layer. It would be great if you could comment on the approach there.

@scwf
Contributor Author

scwf commented Sep 30, 2014

Hi @marmbrus, thanks for your reply. I don't understand what would break when upgrading to Hive 0.13; in my understanding, the SQL syntax of Hive 0.13 is compatible with 0.12, so nothing changes for users. Am I misunderstanding something?

@marmbrus
Contributor

The problem is metastore compatibility. Hive 0.13.0 cannot talk to a 0.12.0 metastore. For this reason we'll want to be able to support both in Spark.
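
To make that concrete, here is a minimal, hypothetical sketch of what a version-keyed shim layer could look like. The trait, object, and method names are invented for illustration; they are not the abstractions actually used in #2241.

```scala
// Hypothetical shim layer keyed on the Hive version; all names are illustrative only.
sealed trait HiveShim {
  def version: String
  // Any call whose behaviour or signature changed between Hive 0.12 and 0.13
  // would be wrapped behind a method like this one.
  def versionSpecificCall(): Unit
}

case object Shim012 extends HiveShim {
  val version = "0.12.0"
  def versionSpecificCall(): Unit = println("using the Hive 0.12 code path")
}

case object Shim013 extends HiveShim {
  val version = "0.13.1"
  def versionSpecificCall(): Unit = println("using the Hive 0.13 code path")
}

object HiveShims {
  // Select the shim matching the Hive version Spark was built against.
  def forVersion(v: String): HiveShim =
    if (v.startsWith("0.12")) Shim012
    else if (v.startsWith("0.13")) Shim013
    else sys.error(s"Unsupported Hive version: $v")
}
```

With something like this, the rest of Spark SQL would only call HiveShims.forVersion(...) and the trait's methods, keeping all version-specific code in one place.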

@scwf
Contributor Author

scwf commented Sep 30, 2014

Got it. I see that #2241 does not enable hive-thriftserver; maybe I can open a new PR that covers hive-thriftserver, based on this one (which already upgrades the thrift server's Hive version to 0.13.1). What do you think?

@marmbrus
Contributor

It would be great to support the thrift server for both 0.12.0 and 0.13.0. Please discuss your design with @liancheng .

@scwf
Contributor Author

scwf commented Sep 30, 2014

OK, thanks. @liancheng, we could follow the approach of #2241 and provide a shim layer for hive-thriftserver so that it supports both 0.12 and 0.13. What do you think?

@liancheng
Contributor

@scwf A shim layer seems reasonable if we can make clean abstractions. A major issue is that the original HiveServer/HiveServer2 were not designed to be extended by other applications, which is why we have to use reflection tricks to implement HiveThriftServer2 and SparkSQLCLIDriver.
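
For readers unfamiliar with those reflection tricks, here is a minimal sketch of the pattern: overwriting a private field declared on a superclass that was never meant to be extended. The helper name and usage below are made up for illustration and are not the actual Spark code.

```scala
import java.lang.reflect.Field

// Illustrative helper only; not the real implementation used by HiveThriftServer2.
object ReflectionUtilsSketch {
  // Overwrite a (possibly private) field declared on `declaringClass` for instance `obj`.
  def setSuperField(obj: AnyRef, declaringClass: Class[_], fieldName: String, value: AnyRef): Unit = {
    val field: Field = declaringClass.getDeclaredField(fieldName)
    field.setAccessible(true) // bypass the private access modifier
    field.set(obj, value)
  }
}
```

A subclass could call a helper like this after the parent's initialization to swap in a Spark-specific service instance; it only works because reflection bypasses access checks, which is exactly why it is described as a trick.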

@scwf
Contributor Author

scwf commented Sep 30, 2014

OK, I will try implementing this with a shim layer.

@marmbrus
Contributor

marmbrus commented Oct 1, 2014

Would it be okay to close this issue for now, and reopen it when you have a draft of the hive server? Thanks!
