
Conversation

@sunchao
Member

@sunchao sunchao commented Sep 14, 2021

What changes were proposed in this pull request?

This PR creates a new module, hive-shaded, which shades & relocates various dependencies from Hive, including Guava. In doing so, it also upgrades the Guava version from 14.0.1 to 30.1.1-jre.
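For readers unfamiliar with class relocation, below is a minimal maven-shade-plugin sketch of the kind of Guava relocation such a module could perform. The relocated package prefix and the exact plugin configuration are illustrative assumptions, not the precise contents of this PR.

```xml
<!-- Sketch only: relocate Guava's packages inside the shaded Hive jar so they
     cannot clash with the Guava version on Spark's own classpath. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>org.sparkproject.hive.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```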

Why are the changes needed?

Spark currently uses hive-exec-core, which leaks a lot of dependencies to Spark, in particular Guava. As a consequence, Spark is stuck with an ancient Guava version, 14.0.1, which also carries the CVE issues described in SPARK-32502.

By creating a shaded module, Spark is able to de-couple from those dependencies leaked by Hive and upgrade to newer versions of Guava. This also allows us to upgrade other dependencies whenever necessary, without having to wait for new Hive releases.

Does this PR introduce any user-facing change?

No. For Spark users who programmatically depend on modules such as spark-hive, the Hive dependencies will be replaced by the newly created hive-shaded module.

How was this patch tested?

Existing tests.

@dongjoon-hyun
Member

Thank you, @sunchao .

cc @cloud-fan , @HyukjinKwon , @gengliangwang , @wangyum

@dongjoon-hyun dongjoon-hyun changed the title [SPARK-36676][SQL] Create shaded Hive module and upgrade Guava version to 30.1.1-jre [SPARK-36676][SQL][BUILD] Create shaded Hive module and upgrade Guava version to 30.1.1-jre Sep 14, 2021
@SparkQA

SparkQA commented Sep 14, 2021

Kubernetes integration test unable to build dist.

exiting with code: 1
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/47746/

@SparkQA

SparkQA commented Sep 14, 2021

Test build #143243 has finished for PR 33989 at commit d5a6f2c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

Since we now use shaded Guava in Hive, Hadoop, and Spark, can we remove this dependency entirely?

Member Author

@sunchao sunchao Sep 14, 2021

It actually might be possible: it seems Spark now shades Guava everywhere within itself. I remember seeing a few modules that were not using the shaded version, but now I can't find them anymore. Perhaps that was a mistake on my side.

It's a bit strange that Spark packs the shaded Guava classes in its spark-network-common jar. This works because spark-network-common is required by spark-core and thus transitively required by other modules.

I need to fix the PR since the "inherited" tag will prevent Spark from shading Jetty & Guava in other modules, which is bad. Thinking about how to do proper plugin inheritance in Maven...
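For context, the Maven mechanism being referred to is the plugin-level `<inherited>` flag; a minimal sketch of its shape follows (an illustration, not Spark's actual pom).

```xml
<!-- Sketch only: <inherited>false</inherited> on a plugin declared in a parent pom
     keeps child modules from inheriting this shade execution (Maven defaults to true),
     at the cost of each child having to declare its own configuration. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <inherited>false</inherited>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```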

Member

Curator has a hard dependency on Guava; see details at https://cwiki.apache.org/confluence/display/CURATOR/TN13

Member Author

@sunchao sunchao Sep 15, 2021

Hmm, you are right, I missed this. Since Spark declares Guava as a provided dependency, mvn dependency:tree won't show the Guava usages in Spark's transitive dependencies.

So sadly, it seems we still have to include an unshaded Guava jar in Spark's distribution.

Member Author

So build/sbt dependencyTree provides a better result, and it looks like only curator-client and hive-exec have a hard dependency on Guava at the moment. With the hive-shaded module, the latter is resolved, so we are only left with the former, which makes me wonder whether we can just skip relocating the following 3 classes required by Curator:

  • com.google.common.base.Function
  • com.google.common.base.Predicate
  • com.google.common.reflect.TypeToken

It's not ideal but is better than exposing all the Guava classes.
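A hedged sketch of how that could look with maven-shade-plugin's per-class relocation excludes; the class list comes from the comment above, while the shaded package prefix is an assumption.

```xml
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>org.sparkproject.hive.com.google.common</shadedPattern>
  <excludes>
    <!-- Leave the handful of classes Curator references un-relocated. -->
    <exclude>com.google.common.base.Function</exclude>
    <exclude>com.google.common.base.Predicate</exclude>
    <exclude>com.google.common.reflect.TypeToken</exclude>
  </excludes>
</relocation>
```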

Member

@pan3793 pan3793 Sep 16, 2021

https://guava.dev/

APIs without @beta will remain binary-compatible for the indefinite future. (Previously, we sometimes removed such APIs after a deprecation period. The last release to remove non-@beta APIs was Guava 21.0.) Even @deprecated APIs will remain (again, unless they are @beta). We have no plans to start removing things again, but officially, we’re leaving our options open in case of surprises (like, say, a serious security problem).

Aside from hive-shaded, I think it's fine to 1) use the vanilla Guava jar directly, 2) avoid using Guava @Beta APIs in Spark, and 3) keep Guava at the latest version.

Member Author

Thanks for the reference! This is good from Spark's side, but it won't help if downstream apps use @Beta APIs which were changed, thus causing conflicts. I agree with you that we should avoid using these @Beta APIs - perhaps it's worth adding a Maven plugin to check this.

In the meantime I'll leave the Guava dependency in the assembly module and we should revisit it separately. Ideally I feel we should remove it if only curator-client is blocking it, but we'll need to double check on that.

@sunchao sunchao force-pushed the SPARK-36676-shade-hive branch from d5a6f2c to 4624325 on September 14, 2021 22:10
@SparkQA

SparkQA commented Sep 14, 2021

Kubernetes integration test unable to build dist.

exiting with code: 1
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/47784/

@SparkQA

SparkQA commented Sep 15, 2021

Test build #143281 has finished for PR 33989 at commit 4624325.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dbtsai dbtsai self-requested a review September 15, 2021 00:51
Member

@dbtsai dbtsai left a comment

LGTM. Thanks!

@pan3793
Member

pan3793 commented Sep 15, 2021

If I understand correctly, the hive-shaded module creates a fat jar that contains all Hive classes and the classes from Hive's transitive dependencies, but only a few classes are relocated. (Correct me if I'm wrong.)

That means the classes which are not relocated will cause class conflicts if the user also provides the original jars, e.g. HikariCP.

For this case, maybe we can learn something from https://github.com/trinodb/trino-hive-apache

@sunchao
Member Author

sunchao commented Sep 15, 2021

@pan3793 yes that's correct, I'm not relocating all the dependencies because the others haven't caused any issues so far AFAIK (after all, Spark today already leaks these dependencies). We can definitely include them if necessary; we just need to check whether they are used in APIs and whether that causes any breakage.

@pan3793
Member

pan3793 commented Sep 15, 2021

Spark today already leaks these dependencies

Yeah, but users can see what jars are in $SPARK_HOME/jars and 1) align dependency versions with Spark when building Spark applications, 2) exclude jars already under $SPARK_HOME/jars when packaging or submitting Spark jobs.

For hive-shaded, the user has no idea which jars should be aligned and excluded.

I think hive-shaded should relocate all classes except org.apache.hive, which makes things simple: the user just needs to exclude all vanilla Hive jars.
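For illustration, "relocate everything except org.apache.hive" would amount to enumerating each leaked third-party package in the shade plugin's relocation list and simply not listing Hive's own packages. The package names below are examples drawn from this thread; the shaded prefix is an assumption.

```xml
<relocations>
  <!-- Example entries only; the real list would cover every bundled third-party package. -->
  <relocation>
    <pattern>com.zaxxer.hikari</pattern>
    <shadedPattern>org.sparkproject.hive.com.zaxxer.hikari</shadedPattern>
  </relocation>
  <relocation>
    <pattern>org.apache.thrift</pattern>
    <shadedPattern>org.sparkproject.hive.org.apache.thrift</shadedPattern>
  </relocation>
  <!-- org.apache.hive itself is deliberately left out, so Hive classes keep their names. -->
</relocations>
```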

@sunchao
Member Author

sunchao commented Sep 15, 2021

Makes sense. Let me try to relocate those too.

@sunchao sunchao force-pushed the SPARK-36676-shade-hive branch from 4624325 to dd28c5b on September 16, 2021 20:44
@SparkQA

SparkQA commented Sep 16, 2021

Kubernetes integration test unable to build dist.

exiting with code: 1
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/47872/

@SparkQA

SparkQA commented Sep 16, 2021

Kubernetes integration test unable to build dist.

exiting with code: 1
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/47873/

@SparkQA

SparkQA commented Sep 16, 2021

Test build #143365 has finished for PR 33989 at commit dd28c5b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Sep 16, 2021

Test build #143366 has finished for PR 33989 at commit c8e5764.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member

cc @srowen too FYI

@SparkQA

SparkQA commented Sep 17, 2021

Kubernetes integration test unable to build dist.

exiting with code: 1
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/47878/

@SparkQA

SparkQA commented Sep 17, 2021

Test build #143371 has finished for PR 33989 at commit aaac03a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@sunchao sunchao marked this pull request as ready for review September 17, 2021 15:52
Member

@pan3793 pan3793 Sep 20, 2021

Can the property ${hive.group} be replaced with the literal org.apache.hive? I also see <groupId>org.apache.hive</groupId> used directly in some places.

Member Author

You mean replace org.apache.hive with ${hive.group}? Yes, we should do it, although I just copied these verbatim from the existing hive/pom.xml :)

Member

What value would be assigned to the ${hive.group} variable other than org.apache.hive? If there are no other options, we can remove ${hive.group} and just use the literal org.apache.hive.

Member Author

I'm fine with either way - I don't see much difference between the two.
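For readers less familiar with Maven, the two spellings being discussed resolve to the same coordinates as long as the property is defined in the parent pom. A minimal illustration follows; the hive.group and hive.version properties here are assumptions about Spark's pom, not a snippet from this PR.

```xml
<!-- Illustrative only: equivalent to <groupId>org.apache.hive</groupId>
     when the parent pom defines <hive.group>org.apache.hive</hive.group>. -->
<dependency>
  <groupId>${hive.group}</groupId>
  <artifactId>hive-metastore</artifactId>
  <version>${hive.version}</version>
</dependency>
```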

Member

This means the shaded jar will be different when packaging with or without -Phive-thriftserver; which one will be published to Maven Central?

Member Author

I believe we always publish binaries with -Phive-thriftserver, see here.

Member

Got it. Thanks for the explanation.

Member

@pan3793 pan3793 Sep 20, 2021

javax.jdo, javax.realtime, javax.transaction, javax.xml, and org.json still exist in the spark-hive-shaded jar with their original package names.

Member Author

Oops, I forgot these. I'm thinking of adding these javax.* dependencies explicitly in spark-hive and excluding them from the uber jar, since they seem relatively stable. What do you think?

On the other hand, it seems we should exclude org.json from Hive due to SPARK-18262.
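A possible shape for keeping those artifacts out of the uber jar is the shade plugin's artifactSet excludes; the coordinates below are taken from the dependency list later in this thread and are only an illustration of the idea.

```xml
<artifactSet>
  <excludes>
    <!-- Keep these as ordinary (non-bundled) dependencies of spark-hive instead. -->
    <exclude>javax.jdo:jdo-api</exclude>
    <exclude>javax.transaction:jta</exclude>
    <exclude>org.datanucleus:javax.jdo</exclude>
  </excludes>
</artifactSet>
```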

Member

It includes some logging or other config files in the shaded jar, like bonecp-default-config.xml, hive-exec-log4j2.properties, hive-log4j2.properties, package.jdo, parquet-logging.properties, testpool.jocl, and tez-container-log4j2.properties; does Spark really need them?
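If those files turn out to be unnecessary, one way to drop them is a shade-plugin resource filter. A sketch follows; the file names are taken from the comment above, while the wildcard filter scope is an assumption.

```xml
<filters>
  <filter>
    <!-- Apply to every bundled artifact and drop stray config/resource files. -->
    <artifact>*:*</artifact>
    <excludes>
      <exclude>bonecp-default-config.xml</exclude>
      <exclude>hive-exec-log4j2.properties</exclude>
      <exclude>hive-log4j2.properties</exclude>
      <exclude>parquet-logging.properties</exclude>
      <exclude>tez-container-log4j2.properties</exclude>
    </excludes>
  </filter>
</filters>
```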

Member Author

Good point. Let me check on those.

@SparkQA

SparkQA commented Sep 24, 2021

Test build #143610 has finished for PR 33989 at commit c2ec1f8.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@JoshRosen
Contributor

A cross-reference for other reviewers:

Given that hive-exec shades Guava in Hive 2.3.8+ (apache/hive#1356), I was initially confused about why we needed to do our own shading in this PR: I originally thought that it was done to shade a broader set of dependencies beyond just Guava, further isolating us from future dependency conflicts. As @viirya points out at #29326 (comment), though, Spark uses the hive-exec-core JAR, not hive-exec, so Hive's Guava shading doesn't apply (hence the need to shade here).

@sunchao
Member Author

sunchao commented Sep 25, 2021

Hmm, interesting. After I changed the isolated class loader to pick Guava classes from the Hive jars (which are at 14.0.1), tests started to fail because it now seems to use Spark's built-in Guava, which is 30.1.1-jre. This doesn't seem to make sense.

[error] sbt.ForkMain$ForkError: java.lang.IllegalAccessError: tried to access method com.google.common.collect.Iterators.emptyIterator()Lcom/google/common/collect/UnmodifiableIterator; from class org.apache.hadoop.hive.ql.exec.FetchOperator
[error] 	at org.apache.hadoop.hive.ql.exec.FetchOperator.<init>(FetchOperator.java:108)
[error] 	at org.apache.hadoop.hive.ql.exec.FetchTask.initialize(FetchTask.java:87)
[error] 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:541)
[error] 	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
[error] 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
[error] 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
[error] 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
[error] 	at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$runHive$1(HiveClientImpl.scala:831)

Iterators.emptyIterator here is no longer public in newer versions of Guava.

@JoshRosen
Contributor

JoshRosen commented Sep 28, 2021

I think the test failures seen above are due to the tests not picking up the shaded classes: I wouldn't expect to see a reference to non-relocated Guava classnames if shaded Hive was being used by the tests.

The SBT build doesn't perform any shading (because the sbt-pom-reader plugin only captures the project dependency structure, not build plugins). To date this hasn't been a problem since Spark's existing shading was only for the benefit of downstream consumers of Spark itself and wasn't necessary for Spark's internal functioning (and official releases are published with Maven, not SBT). With this new hive-shaded, though, we'll need to find a way to wire shaded classes into the SBT build.

I think we'll also run into similar problems in the Maven build. According to Maven's build lifecycle docs:

test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed

The shading runs during the package phase, which is after the test phase, so the tests won't be able to use the shaded artifacts.

The hive-shaded module doesn't depend on any source code from Spark. Given this, a clean path forward might be to publish this Maven artifact separately from Spark's main build, then depend on the published artifact. This would be similar to the approach taken in https://github.com/apache/hbase-thirdparty and https://github.com/apache/hadoop-thirdparty . This would require a bunch of new infra setup, though, so if we're considering pursuing this then we should open a new thread on the dev@ mailing list to discuss.

For now, you could test out that approach via manual local publishing: that'd unblock progress on testing the IsolatedClientLoader and other things.

Why did the tests pass before? It looks like your most recent commit in c2ec1f8 included the DependencyOverrides version bump in SparkBuild.scala as well as the change to the IsolatedClientLoader. Prior to the DependencyOverrides change, the SBT build was using both unshaded Hive and old Guava, so I think the previous successful tests were passing accidentally and that the new failures aren't caused by the IsolatedClientLoader change.

@JoshRosen
Contributor

Also, I spent a bit of time re-reading the IsolatedClientLoader code and think I now have a better understanding of how it works and have a better explanation for why I think we should exclude Guava from sharedClasses:


First, let's consider the case where the metastore and execution Hive versions are the same:

In the else if (!isSharedClass(name)) branch we'll try to load from the isolated classloader itself via a super.loadClass call. This will first attempt to load classes from rootClassLoader and then will fall back on loading classes from the jars specified in allJars. When the built-in execution Hive is used, it looks like allJars gets populated from the context/Spark classloader's JARs (which is effectively the classpath used by Spark user code). In this case we'd expect Guava 30.1.1-jre to be used.

In the else branch (taken when isSharedClass(name) == true), we first try to load from baseClassLoader, which is the context/Spark classloader. Failing that, we then try the steps described in the previous branch.

This makes sense to me: if the metastore and execution Hive versions are the same then there's no separate classpath of downloaded or user-provided metastore JARs to use, so we'll wind up loading classes using Spark's own classpath. In this case I think the only difference in these two branches is whether we'll use already-loaded classes from Spark's classloader or whether we'll load them again in the IsolatedClientLoader, but in both cases the classpath should be the same. If Spark's Guava version is the same as the version used by Hive then it's preferable to load from the shared classloader so that we avoid unnecessary duplicate classloading.


What happens when the metastore Hive version is not the same as the execution Hive version?

In that case, non-shared classes will be loaded from allJars, which should be a different classpath from Spark itself. As a result, I think we should exclude Guava from the shared classes.

If Spark resolves the metastore JARs from Maven, this allJars path should include a downloaded copy of Guava 14.0.1.

A common pattern that I've seen is that users will use Spark's existing Maven-based metastore resolution once to download the JARs, then will copy the result to a durable location to avoid re-downloading in the future. In that case they'll already have the required old version of Guava and things should continue to work while upgrading.

There's a tricky corner-case if a user has manually built a metastore classpath which includes only the dependencies not already provided by Spark: in that case, an upgrade might cause the user to run into dependency errors if Spark upgrades its Guava and the user's metastore JARs path doesn't include the old version. This is a purely hypothetical scenario and I imagine it would be rare, but it's potentially worth flagging in the docs' upgrade guide.

@JoshRosen
Contributor

One more consideration: what about Hadoop 2.7 builds?

My understanding is that Hadoop used unshaded Guava until HADOOP-16924 in Hadoop 3.3.1.

Since Hadoop 2.7.4 still uses unshaded Guava, what are our options? We haven't decided to drop Hadoop 2.7 support yet. Should we continue to use unshaded old Guava under the Hadoop 2.7 profile? Is that potentially confusing to users?

(To be clear, I'm highly in favor of upgrading Guava; I just want to make sure we've thought through the implications and interactions for all of the different build and deployment environments).

@sunchao
Member Author

sunchao commented Sep 28, 2021

Thanks @JoshRosen! This is some great analysis!

I think we'll also run into similar problems in the Maven build. According to Maven's build lifecycle docs:

I completely missed this 🤦. Yes, adding the hive-shaded module to Spark will not be a good idea given the above reasons about the SBT and Maven test lifecycles, and now I understand why other projects put the shaded library in a different repo :)

Let me spend more time to revisit the following two paths:

  1. shade all the dependencies in Hive (e.g., via the hive-exec fat jar) and make a new release, so Spark can start using that.
  2. create an ASF repo such as spark-thirdparty, following the examples from HBase & Hadoop. This needs community discussion as you mentioned, and I'm not sure how much additional burden it will add to Spark's maintenance process.

There's a tricky corner-case if a user has manually built a metastore classpath which includes only the dependencies not already provided by Spark

Thanks for the detailed explanation of how the IsolatedClientLoader works, and I agree this is a minor issue we should be aware of. We can either put something in the release notes, or perhaps exclude the unshaded Guava jar completely from the Spark distribution (for hadoop-3.2). Currently this appears to be blocked by the curator-client dependency as discussed earlier in the PR, but perhaps there is still a way to ship only shaded Guava (from network-common) with those few classes required by curator-client excluded from relocation.

One more consideration: what about Hadoop 2.7 builds?

Another good question :) You are right that Hadoop 2.7 still uses unshaded Guava, while Hadoop 3.3.1 has switched to shaded Guava via HADOOP-17288. In addition, Spark is using the shaded Hadoop client from HADOOP-11804, which further relocates other Hadoop dependencies so they won't pollute Spark's classpath.

I think one approach is to keep Guava 14.0.1 for the hadoop-2.7 profile so everything still stays the same there. This at least unblocks upgrading Guava for the default hadoop-3.2 profile, and makes sure all the published Spark artifacts get the newer version of Guava. Also, the aforementioned idea of excluding unshaded Guava from the Spark distribution will only apply to the latter.

A crazier idea is to also shade Hadoop 2.7 if we go with the spark-thirdparty approach, but I'm not sure whether it's worth it given that we are going to deprecate hadoop-2.7 eventually.
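A minimal illustration of the profile-based approach described above; the profile ids mirror the hadoop-2.7/hadoop-3.2 profiles mentioned in this thread, but the property name and exact snippet are assumptions, not this PR's diff.

```xml
<profiles>
  <profile>
    <id>hadoop-2.7</id>
    <properties>
      <!-- Keep the old Guava for Hadoop 2.7 builds, which still use unshaded Guava. -->
      <guava.version>14.0.1</guava.version>
    </properties>
  </profile>
  <profile>
    <id>hadoop-3.2</id>
    <properties>
      <!-- The default profile moves to the newer Guava. -->
      <guava.version>30.1.1-jre</guava.version>
    </properties>
  </profile>
</profiles>
```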

@pan3793
Member

pan3793 commented Oct 12, 2021

  1. shade all the dependencies in Hive (e.g., via hive-exec fat jar) and make a new release, so Spark can start using that.

I like this approach. Lots of downstream projects would benefit if Hive provided a shaded hive-exec jar that relocates everything (maybe with an exception for slf4j or Hadoop classes) except Hive classes.

@mridulm
Contributor

mridulm commented Oct 13, 2021

+CC @xkrogen, @shardulm94

@sunchao
Member Author

sunchao commented Oct 14, 2021

I like this approach. Lots of downstream projects would benefit if Hive provided a shaded hive-exec jar that relocates everything (maybe with an exception for slf4j or Hadoop classes) except Hive classes.

Yes, that would be ideal. There has been some recent discussion on this in the Hive community as well, but IMO it could be very difficult to implement because:

  1. it's hard to fully shade and relocate all Hive dependencies, since they could appear in public APIs crossing module & project boundaries. See this PR for some previous effort on this.
  2. Currently Hive only provides a fat hive-exec jar, but not for other modules like hive-metastore, hive-beeline, etc.
  3. Spark is using a very old Hive version, 2.3.x: it could be a bit tricky to backport the changes from master to this branch.

@github-actions

We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!

@github-actions github-actions bot added the Stale label Jan 24, 2022
@github-actions github-actions bot closed this Jan 25, 2022
@JoshRosen JoshRosen mentioned this pull request Aug 17, 2022
dongjoon-hyun pushed a commit that referenced this pull request Jan 4, 2024
…edClientLoader

### What changes were proposed in this pull request?

Try removing Guava from `sharedClasses` as suggested by JoshRosen in https://github.com/apache/spark/pull/33989#issuecomment-928616327 and https://github.com/apache/spark/pull/42493#issuecomment-1687092403

### Why are the changes needed?

Unblock Guava upgrading.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

CI passed (embedded HMS) and verified on an internal YARN cluster (remote HMS with Kerberos enabled).

```
# already setup hive-site.xml stuff properly to make sure to use remote HMS
bin/spark-shell --conf spark.sql.hive.metastore.jars=maven

...

scala> spark.sql("show databases").show
warning: 1 deprecation (since 2.13.3); for details, enable `:setting -deprecation` or `:replay -deprecation`
https://maven-central.storage-download.googleapis.com/maven2/ added as a remote repository with the name: repo-1
Ivy Default Cache set to: /home/hadoop/.ivy2/cache
The jars for the packages stored in: /home/hadoop/.ivy2/jars
org.apache.hive#hive-metastore added as a dependency
org.apache.hive#hive-exec added as a dependency
org.apache.hive#hive-common added as a dependency
org.apache.hive#hive-serde added as a dependency
org.apache.hadoop#hadoop-client-api added as a dependency
org.apache.hadoop#hadoop-client-runtime added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-d0d2962d-ae27-4526-a0c7-040a542e1e54;1.0
	confs: [default]
	found org.apache.hive#hive-metastore;2.3.9 in central
	found org.apache.hive#hive-serde;2.3.9 in central
	found org.apache.hive#hive-common;2.3.9 in central
	found org.apache.hive#hive-shims;2.3.9 in central
	found org.apache.hive.shims#hive-shims-common;2.3.9 in central
	found org.apache.logging.log4j#log4j-slf4j-impl;2.6.2 in central
	found org.slf4j#slf4j-api;1.7.10 in central
	found com.google.guava#guava;14.0.1 in central
	found commons-lang#commons-lang;2.6 in central
	found org.apache.thrift#libthrift;0.9.3 in central
	found org.apache.httpcomponents#httpclient;4.4 in central
	found org.apache.httpcomponents#httpcore;4.4 in central
	found commons-logging#commons-logging;1.2 in central
	found commons-codec#commons-codec;1.4 in central
	found org.apache.zookeeper#zookeeper;3.4.6 in central
	found org.slf4j#slf4j-log4j12;1.6.1 in central
	found log4j#log4j;1.2.16 in central
	found jline#jline;2.12 in central
	found io.netty#netty;3.7.0.Final in central
	found org.apache.hive.shims#hive-shims-0.23;2.3.9 in central
	found org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.7.2 in central
	found org.apache.hadoop#hadoop-annotations;2.7.2 in central
	found com.google.inject.extensions#guice-servlet;3.0 in central
	found com.google.inject#guice;3.0 in central
	found javax.inject#javax.inject;1 in central
	found aopalliance#aopalliance;1.0 in central
	found org.sonatype.sisu.inject#cglib;2.2.1-v20090111 in central
	found asm#asm;3.2 in central
	found com.google.protobuf#protobuf-java;2.5.0 in central
	found commons-io#commons-io;2.4 in central
	found com.sun.jersey#jersey-json;1.14 in central
	found org.codehaus.jettison#jettison;1.1 in central
	found com.sun.xml.bind#jaxb-impl;2.2.3-1 in central
	found javax.xml.bind#jaxb-api;2.2.2 in central
	found javax.xml.stream#stax-api;1.0-2 in central
	found javax.activation#activation;1.1 in central
	found org.codehaus.jackson#jackson-core-asl;1.9.13 in central
	found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in central
	found org.codehaus.jackson#jackson-jaxrs;1.9.13 in central
	found org.codehaus.jackson#jackson-xc;1.9.13 in central
	found com.sun.jersey#jersey-core;1.14 in central
	found com.sun.jersey.contribs#jersey-guice;1.9 in central
	found com.sun.jersey#jersey-server;1.14 in central
	found org.apache.hadoop#hadoop-yarn-common;2.7.2 in central
	found org.apache.hadoop#hadoop-yarn-api;2.7.2 in central
	found org.apache.commons#commons-compress;1.9 in central
	found org.mortbay.jetty#jetty-util;6.1.26 in central
	found com.sun.jersey#jersey-client;1.9 in central
	found commons-cli#commons-cli;1.2 in central
	found log4j#log4j;1.2.17 in central
	found org.apache.hadoop#hadoop-yarn-server-common;2.7.2 in central
	found org.fusesource.leveldbjni#leveldbjni-all;1.8 in central
	found org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.7.2 in central
	found commons-collections#commons-collections;3.2.2 in central
	found org.apache.hadoop#hadoop-yarn-server-web-proxy;2.7.2 in central
	found org.mortbay.jetty#jetty;6.1.26 in central
	found org.apache.hive.shims#hive-shims-scheduler;2.3.9 in central
	found org.apache.hive#hive-storage-api;2.4.0 in central
	found org.apache.commons#commons-lang3;3.1 in central
	found org.apache.orc#orc-core;1.3.4 in central
	found io.airlift#aircompressor;0.8 in central
	found io.airlift#slice;0.29 in central
	found org.openjdk.jol#jol-core;0.2 in central
	found org.eclipse.jetty.aggregate#jetty-all;7.6.0.v20120127 in central
	found org.apache.geronimo.specs#geronimo-jta_1.1_spec;1.1.1 in central
	found javax.mail#mail;1.4.1 in central
	found org.apache.geronimo.specs#geronimo-jaspic_1.0_spec;1.0 in central
	found org.apache.geronimo.specs#geronimo-annotation_1.0_spec;1.1.1 in central
	found asm#asm-commons;3.1 in central
	found asm#asm-tree;3.1 in central
	found org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016 in central
	found joda-time#joda-time;2.8.1 in central
	found org.apache.logging.log4j#log4j-1.2-api;2.6.2 in central
	found org.apache.logging.log4j#log4j-web;2.6.2 in central
	found org.apache.ant#ant;1.9.1 in central
	found org.apache.ant#ant-launcher;1.9.1 in central
	found com.tdunning#json;1.8 in central
	found io.dropwizard.metrics#metrics-core;3.1.0 in central
	found io.dropwizard.metrics#metrics-jvm;3.1.0 in central
	found io.dropwizard.metrics#metrics-json;3.1.0 in central
	found com.github.joshelser#dropwizard-metrics-hadoop-metrics2-reporter;0.1.2 in central
	found org.apache.hadoop#hadoop-common;2.7.2 in central
	found org.apache.commons#commons-math3;3.1.1 in central
	found xmlenc#xmlenc;0.52 in central
	found commons-httpclient#commons-httpclient;3.1 in central
	found commons-net#commons-net;3.1 in central
	found javax.servlet#servlet-api;2.5 in central
	found net.java.dev.jets3t#jets3t;0.9.0 in central
	found com.jamesmurty.utils#java-xmlbuilder;0.4 in central
	found commons-configuration#commons-configuration;1.6 in central
	found commons-digester#commons-digester;1.8 in central
	found commons-beanutils#commons-beanutils;1.7.0 in central
	found commons-beanutils#commons-beanutils-core;1.8.0 in central
	found org.apache.avro#avro;1.8.2 in central
	found com.thoughtworks.paranamer#paranamer;2.7 in central
	found org.xerial.snappy#snappy-java;1.1.1.3 in central
	found org.tukaani#xz;1.5 in central
	found com.google.code.gson#gson;2.2.4 in central
	found org.apache.hadoop#hadoop-auth;2.7.2 in central
	found org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 in central
	found org.apache.directory.server#apacheds-i18n;2.0.0-M15 in central
	found org.apache.directory.api#api-asn1-api;1.0.0-M20 in central
	found org.apache.directory.api#api-util;1.0.0-M20 in central
	found com.jcraft#jsch;0.1.42 in central
	found com.google.code.findbugs#jsr305;3.0.0 in central
	found org.apache.htrace#htrace-core;3.1.0-incubating in central
	found javax.servlet.jsp#jsp-api;2.1 in central
	found org.slf4j#slf4j-log4j12;1.7.14 in central
	found org.apache.hive#hive-service-rpc;2.3.9 in central
	found tomcat#jasper-compiler;5.5.23 in central
	found javax.servlet#jsp-api;2.0 in central
	found ant#ant;1.6.5 in central
	found tomcat#jasper-runtime;5.5.23 in central
	found commons-el#commons-el;1.0 in central
	found org.apache.thrift#libfb303;0.9.3 in central
	found net.sf.opencsv#opencsv;2.3 in central
	found org.apache.parquet#parquet-hadoop-bundle;1.8.1 in central
	found javolution#javolution;5.5.1 in central
	found org.apache.hbase#hbase-client;1.1.1 in central
	found org.apache.hbase#hbase-annotations;1.1.1 in central
	found com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1 in central
	found junit#junit;4.11 in central
	found org.hamcrest#hamcrest-core;1.3 in central
	found org.apache.hbase#hbase-protocol;1.1.1 in central
	found io.netty#netty-all;4.0.52.Final in central
	found org.jruby.jcodings#jcodings;1.0.8 in central
	found org.jruby.joni#joni;2.1.2 in central
	found org.apache.hadoop#hadoop-mapreduce-client-core;2.7.2 in central
	found com.jolbox#bonecp;0.8.0.RELEASE in central
	found com.zaxxer#HikariCP;2.5.1 in central
	found org.apache.derby#derby;10.10.2.0 in central
	found org.datanucleus#datanucleus-api-jdo;4.2.4 in central
	found org.datanucleus#datanucleus-core;4.1.17 in central
	found org.datanucleus#datanucleus-rdbms;4.1.19 in central
	found commons-pool#commons-pool;1.5.4 in central
	found commons-dbcp#commons-dbcp;1.4 in central
	found javax.jdo#jdo-api;3.0.1 in central
	found javax.transaction#jta;1.1 in central
	found org.datanucleus#javax.jdo;3.2.0-m3 in central
	found javax.transaction#transaction-api;1.1 in central
	found org.antlr#antlr-runtime;3.5.2 in central
	found co.cask.tephra#tephra-api;0.6.0 in central
	found co.cask.tephra#tephra-core;0.6.0 in central
	found com.google.inject.extensions#guice-assistedinject;3.0 in central
	found it.unimi.dsi#fastutil;6.5.6 in central
	found org.apache.twill#twill-common;0.6.0-incubating in central
	found org.apache.twill#twill-core;0.6.0-incubating in central
	found org.apache.twill#twill-api;0.6.0-incubating in central
	found org.apache.twill#twill-discovery-api;0.6.0-incubating in central
	found org.apache.twill#twill-zookeeper;0.6.0-incubating in central
	found org.apache.twill#twill-discovery-core;0.6.0-incubating in central
	found co.cask.tephra#tephra-hbase-compat-1.0;0.6.0 in central
	found org.apache.hive#hive-exec;2.3.9 in central
	found org.apache.hive#hive-llap-tez;2.3.9 in central
	found org.apache.hive#hive-llap-client;2.3.9 in central
	found org.apache.hive#hive-llap-common;2.3.9 in central
	found org.antlr#ST4;4.0.4 in central
	found org.apache.ivy#ivy;2.4.0 in central
	found org.codehaus.groovy#groovy-all;2.4.4 in central
	found stax#stax-api;1.0.1 in central
	found net.hydromatic#eigenbase-properties;1.1.5 in central
	found org.codehaus.janino#commons-compiler;2.7.6 in central
	found org.codehaus.janino#janino;2.7.6 in central
	found org.apache.hadoop#hadoop-client-api;3.3.6 in central
	found org.xerial.snappy#snappy-java;1.1.8.2 in central
	found org.apache.hadoop#hadoop-client-runtime;3.3.6 in central
	found org.slf4j#slf4j-api;1.7.36 in central
	found com.google.code.findbugs#jsr305;3.0.2 in central
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-metastore/2.3.9/hive-metastore-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-metastore;2.3.9!hive-metastore.jar (1292ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-exec/2.3.9/hive-exec-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-exec;2.3.9!hive-exec.jar (1769ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-common/2.3.9/hive-common-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-common;2.3.9!hive-common.jar (360ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-serde/2.3.9/hive-serde-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-serde;2.3.9!hive-serde.jar (370ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client-api/3.3.6/hadoop-client-api-3.3.6.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-client-api;3.3.6!hadoop-client-api.jar (936ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client-runtime/3.3.6/hadoop-client-runtime-3.3.6.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-client-runtime;3.3.6!hadoop-client-runtime.jar (1418ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-shims/2.3.9/hive-shims-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-shims;2.3.9!hive-shims.jar (361ms)
downloading https://repo1.maven.org/maven2/javolution/javolution/5.5.1/javolution-5.5.1.jar ...
	[SUCCESSFUL ] javolution#javolution;5.5.1!javolution.jar(bundle) (359ms)
downloading https://repo1.maven.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.jar ...
	[SUCCESSFUL ] com.google.guava#guava;14.0.1!guava.jar(bundle) (385ms)
downloading https://repo1.maven.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar ...
	[SUCCESSFUL ] com.google.protobuf#protobuf-java;2.5.0!protobuf-java.jar(bundle) (362ms)
downloading https://repo1.maven.org/maven2/org/apache/hbase/hbase-client/1.1.1/hbase-client-1.1.1.jar ...
	[SUCCESSFUL ] org.apache.hbase#hbase-client;1.1.1!hbase-client.jar (378ms)
downloading https://repo1.maven.org/maven2/com/jolbox/bonecp/0.8.0.RELEASE/bonecp-0.8.0.RELEASE.jar ...
	[SUCCESSFUL ] com.jolbox#bonecp;0.8.0.RELEASE!bonecp.jar(bundle) (356ms)
downloading https://repo1.maven.org/maven2/com/zaxxer/HikariCP/2.5.1/HikariCP-2.5.1.jar ...
	[SUCCESSFUL ] com.zaxxer#HikariCP;2.5.1!HikariCP.jar(bundle) (356ms)
downloading https://repo1.maven.org/maven2/commons-cli/commons-cli/1.2/commons-cli-1.2.jar ...
	[SUCCESSFUL ] commons-cli#commons-cli;1.2!commons-cli.jar (353ms)
downloading https://repo1.maven.org/maven2/commons-lang/commons-lang/2.6/commons-lang-2.6.jar ...
	[SUCCESSFUL ] commons-lang#commons-lang;2.6!commons-lang.jar (372ms)
downloading https://repo1.maven.org/maven2/org/apache/derby/derby/10.10.2.0/derby-10.10.2.0.jar ...
	[SUCCESSFUL ] org.apache.derby#derby;10.10.2.0!derby.jar (398ms)
downloading https://repo1.maven.org/maven2/org/datanucleus/datanucleus-api-jdo/4.2.4/datanucleus-api-jdo-4.2.4.jar ...
	[SUCCESSFUL ] org.datanucleus#datanucleus-api-jdo;4.2.4!datanucleus-api-jdo.jar (361ms)
downloading https://repo1.maven.org/maven2/org/datanucleus/datanucleus-core/4.1.17/datanucleus-core-4.1.17.jar ...
	[SUCCESSFUL ] org.datanucleus#datanucleus-core;4.1.17!datanucleus-core.jar (391ms)
downloading https://repo1.maven.org/maven2/org/datanucleus/datanucleus-rdbms/4.1.19/datanucleus-rdbms-4.1.19.jar ...
	[SUCCESSFUL ] org.datanucleus#datanucleus-rdbms;4.1.19!datanucleus-rdbms.jar (384ms)
downloading https://repo1.maven.org/maven2/commons-pool/commons-pool/1.5.4/commons-pool-1.5.4.jar ...
	[SUCCESSFUL ] commons-pool#commons-pool;1.5.4!commons-pool.jar (354ms)
downloading https://repo1.maven.org/maven2/commons-dbcp/commons-dbcp/1.4/commons-dbcp-1.4.jar ...
	[SUCCESSFUL ] commons-dbcp#commons-dbcp;1.4!commons-dbcp.jar (355ms)
downloading https://repo1.maven.org/maven2/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar ...
	[SUCCESSFUL ] javax.jdo#jdo-api;3.0.1!jdo-api.jar (357ms)
downloading https://repo1.maven.org/maven2/org/datanucleus/javax.jdo/3.2.0-m3/javax.jdo-3.2.0-m3.jar ...
	[SUCCESSFUL ] org.datanucleus#javax.jdo;3.2.0-m3!javax.jdo.jar (357ms)
downloading https://repo1.maven.org/maven2/org/antlr/antlr-runtime/3.5.2/antlr-runtime-3.5.2.jar ...
	[SUCCESSFUL ] org.antlr#antlr-runtime;3.5.2!antlr-runtime.jar (355ms)
downloading https://repo1.maven.org/maven2/org/apache/thrift/libfb303/0.9.3/libfb303-0.9.3.jar ...
	[SUCCESSFUL ] org.apache.thrift#libfb303;0.9.3!libfb303.jar (181ms)
downloading https://repo1.maven.org/maven2/org/apache/thrift/libthrift/0.9.3/libthrift-0.9.3.jar ...
	[SUCCESSFUL ] org.apache.thrift#libthrift;0.9.3!libthrift.jar (183ms)
downloading https://repo1.maven.org/maven2/co/cask/tephra/tephra-api/0.6.0/tephra-api-0.6.0.jar ...
	[SUCCESSFUL ] co.cask.tephra#tephra-api;0.6.0!tephra-api.jar (353ms)
downloading https://repo1.maven.org/maven2/co/cask/tephra/tephra-core/0.6.0/tephra-core-0.6.0.jar ...
	[SUCCESSFUL ] co.cask.tephra#tephra-core;0.6.0!tephra-core.jar (363ms)
downloading https://repo1.maven.org/maven2/co/cask/tephra/tephra-hbase-compat-1.0/0.6.0/tephra-hbase-compat-1.0-0.6.0.jar ...
	[SUCCESSFUL ] co.cask.tephra#tephra-hbase-compat-1.0;0.6.0!tephra-hbase-compat-1.0.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-service-rpc/2.3.9/hive-service-rpc-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-service-rpc;2.3.9!hive-service-rpc.jar (379ms)
downloading https://repo1.maven.org/maven2/commons-codec/commons-codec/1.4/commons-codec-1.4.jar ...
	[SUCCESSFUL ] commons-codec#commons-codec;1.4!commons-codec.jar (355ms)
downloading https://repo1.maven.org/maven2/org/apache/avro/avro/1.8.2/avro-1.8.2.jar ...
	[SUCCESSFUL ] org.apache.avro#avro;1.8.2!avro.jar(bundle) (392ms)
downloading https://repo1.maven.org/maven2/net/sf/opencsv/opencsv/2.3/opencsv-2.3.jar ...
	[SUCCESSFUL ] net.sf.opencsv#opencsv;2.3!opencsv.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/parquet/parquet-hadoop-bundle/1.8.1/parquet-hadoop-bundle-1.8.1.jar ...
	[SUCCESSFUL ] org.apache.parquet#parquet-hadoop-bundle;1.8.1!parquet-hadoop-bundle.jar (408ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-storage-api/2.4.0/hive-storage-api-2.4.0.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-storage-api;2.4.0!hive-storage-api.jar (356ms)
downloading https://repo1.maven.org/maven2/org/apache/commons/commons-lang3/3.1/commons-lang3-3.1.jar ...
	[SUCCESSFUL ] org.apache.commons#commons-lang3;3.1!commons-lang3.jar (358ms)
downloading https://repo1.maven.org/maven2/org/apache/orc/orc-core/1.3.4/orc-core-1.3.4.jar ...
	[SUCCESSFUL ] org.apache.orc#orc-core;1.3.4!orc-core.jar (391ms)
downloading https://repo1.maven.org/maven2/jline/jline/2.12/jline-2.12.jar ...
	[SUCCESSFUL ] jline#jline;2.12!jline.jar (357ms)
downloading https://repo1.maven.org/maven2/org/eclipse/jetty/aggregate/jetty-all/7.6.0.v20120127/jetty-all-7.6.0.v20120127.jar ...
	[SUCCESSFUL ] org.eclipse.jetty.aggregate#jetty-all;7.6.0.v20120127!jetty-all.jar (379ms)
downloading https://repo1.maven.org/maven2/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar ...
	[SUCCESSFUL ] org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.jar(orbit) (359ms)
downloading https://repo1.maven.org/maven2/joda-time/joda-time/2.8.1/joda-time-2.8.1.jar ...
	[SUCCESSFUL ] joda-time#joda-time;2.8.1!joda-time.jar (361ms)
downloading https://repo1.maven.org/maven2/org/apache/logging/log4j/log4j-1.2-api/2.6.2/log4j-1.2-api-2.6.2.jar ...
	[SUCCESSFUL ] org.apache.logging.log4j#log4j-1.2-api;2.6.2!log4j-1.2-api.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/org/apache/logging/log4j/log4j-web/2.6.2/log4j-web-2.6.2.jar ...
	[SUCCESSFUL ] org.apache.logging.log4j#log4j-web;2.6.2!log4j-web.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar ...
	[SUCCESSFUL ] org.apache.logging.log4j#log4j-slf4j-impl;2.6.2!log4j-slf4j-impl.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/commons/commons-compress/1.9/commons-compress-1.9.jar ...
	[SUCCESSFUL ] org.apache.commons#commons-compress;1.9!commons-compress.jar (359ms)
downloading https://repo1.maven.org/maven2/org/apache/ant/ant/1.9.1/ant-1.9.1.jar ...
	[SUCCESSFUL ] org.apache.ant#ant;1.9.1!ant.jar (390ms)
downloading https://repo1.maven.org/maven2/com/tdunning/json/1.8/json-1.8.jar ...
	[SUCCESSFUL ] com.tdunning#json;1.8!json.jar (353ms)
downloading https://repo1.maven.org/maven2/io/dropwizard/metrics/metrics-core/3.1.0/metrics-core-3.1.0.jar ...
	[SUCCESSFUL ] io.dropwizard.metrics#metrics-core;3.1.0!metrics-core.jar(bundle) (356ms)
downloading https://repo1.maven.org/maven2/io/dropwizard/metrics/metrics-jvm/3.1.0/metrics-jvm-3.1.0.jar ...
	[SUCCESSFUL ] io.dropwizard.metrics#metrics-jvm;3.1.0!metrics-jvm.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/io/dropwizard/metrics/metrics-json/3.1.0/metrics-json-3.1.0.jar ...
	[SUCCESSFUL ] io.dropwizard.metrics#metrics-json;3.1.0!metrics-json.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/com/github/joshelser/dropwizard-metrics-hadoop-metrics2-reporter/0.1.2/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar ...
	[SUCCESSFUL ] com.github.joshelser#dropwizard-metrics-hadoop-metrics2-reporter;0.1.2!dropwizard-metrics-hadoop-metrics2-reporter.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/shims/hive-shims-common/2.3.9/hive-shims-common-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive.shims#hive-shims-common;2.3.9!hive-shims-common.jar (355ms)
downloading https://repo1.maven.org/maven2/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar ...
	[SUCCESSFUL ] org.apache.zookeeper#zookeeper;3.4.6!zookeeper.jar(test-jar) (362ms)
downloading https://repo1.maven.org/maven2/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar ...
	[SUCCESSFUL ] org.apache.zookeeper#zookeeper;3.4.6!zookeeper.jar (188ms)
downloading https://repo1.maven.org/maven2/org/apache/httpcomponents/httpclient/4.4/httpclient-4.4.jar ...
	[SUCCESSFUL ] org.apache.httpcomponents#httpclient;4.4!httpclient.jar (368ms)
downloading https://repo1.maven.org/maven2/org/apache/httpcomponents/httpcore/4.4/httpcore-4.4.jar ...
	[SUCCESSFUL ] org.apache.httpcomponents#httpcore;4.4!httpcore.jar (359ms)
downloading https://repo1.maven.org/maven2/commons-logging/commons-logging/1.2/commons-logging-1.2.jar ...
	[SUCCESSFUL ] commons-logging#commons-logging;1.2!commons-logging.jar (353ms)
downloading https://repo1.maven.org/maven2/io/netty/netty/3.7.0.Final/netty-3.7.0.Final.jar ...
	[SUCCESSFUL ] io.netty#netty;3.7.0.Final!netty.jar(bundle) (373ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/shims/hive-shims-0.23/2.3.9/hive-shims-0.23-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive.shims#hive-shims-0.23;2.3.9!hive-shims-0.23.jar (359ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/shims/hive-shims-scheduler/2.3.9/hive-shims-scheduler-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive.shims#hive-shims-scheduler;2.3.9!hive-shims-scheduler.jar (355ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-resourcemanager/2.7.2/hadoop-yarn-server-resourcemanager-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.7.2!hadoop-yarn-server-resourcemanager.jar (375ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-annotations/2.7.2/hadoop-annotations-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-annotations;2.7.2!hadoop-annotations.jar (361ms)
downloading https://repo1.maven.org/maven2/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar ...
	[SUCCESSFUL ] com.google.inject.extensions#guice-servlet;3.0!guice-servlet.jar (355ms)
downloading https://repo1.maven.org/maven2/commons-io/commons-io/2.4/commons-io-2.4.jar ...
	[SUCCESSFUL ] commons-io#commons-io;2.4!commons-io.jar (357ms)
downloading https://repo1.maven.org/maven2/com/google/inject/guice/3.0/guice-3.0.jar ...
	[SUCCESSFUL ] com.google.inject#guice;3.0!guice.jar (368ms)
downloading https://repo1.maven.org/maven2/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar ...
	[SUCCESSFUL ] com.sun.jersey#jersey-json;1.14!jersey-json.jar (356ms)
downloading https://repo1.maven.org/maven2/com/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar ...
	[SUCCESSFUL ] com.sun.jersey.contribs#jersey-guice;1.9!jersey-guice.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-common/2.7.2/hadoop-yarn-common-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-common;2.7.2!hadoop-yarn-common.jar (379ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-api/2.7.2/hadoop-yarn-api-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-api;2.7.2!hadoop-yarn-api.jar (386ms)
downloading https://repo1.maven.org/maven2/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar ...
	[SUCCESSFUL ] javax.xml.bind#jaxb-api;2.2.2!jaxb-api.jar (356ms)
downloading https://repo1.maven.org/maven2/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar ...
	[SUCCESSFUL ] org.codehaus.jettison#jettison;1.1!jettison.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar ...
	[SUCCESSFUL ] com.sun.jersey#jersey-core;1.14!jersey-core.jar (360ms)
downloading https://repo1.maven.org/maven2/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar ...
	[SUCCESSFUL ] com.sun.jersey#jersey-client;1.9!jersey-client.jar(bundle) (355ms)
downloading https://repo1.maven.org/maven2/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar ...
	[SUCCESSFUL ] org.mortbay.jetty#jetty-util;6.1.26!jetty-util.jar (355ms)
downloading https://repo1.maven.org/maven2/log4j/log4j/1.2.17/log4j-1.2.17.jar ...
	[SUCCESSFUL ] log4j#log4j;1.2.17!log4j.jar(bundle) (361ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-common/2.7.2/hadoop-yarn-server-common-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-common;2.7.2!hadoop-yarn-server-common.jar (367ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-applicationhistoryservice/2.7.2/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.7.2!hadoop-yarn-server-applicationhistoryservice.jar (363ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-web-proxy/2.7.2/hadoop-yarn-server-web-proxy-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-web-proxy;2.7.2!hadoop-yarn-server-web-proxy.jar (354ms)
downloading https://repo1.maven.org/maven2/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar ...
	[SUCCESSFUL ] org.fusesource.leveldbjni#leveldbjni-all;1.8!leveldbjni-all.jar(bundle) (372ms)
downloading https://repo1.maven.org/maven2/javax/inject/javax.inject/1/javax.inject-1.jar ...
	[SUCCESSFUL ] javax.inject#javax.inject;1!javax.inject.jar (355ms)
downloading https://repo1.maven.org/maven2/aopalliance/aopalliance/1.0/aopalliance-1.0.jar ...
	[SUCCESSFUL ] aopalliance#aopalliance;1.0!aopalliance.jar (368ms)
downloading https://repo1.maven.org/maven2/org/sonatype/sisu/inject/cglib/2.2.1-v20090111/cglib-2.2.1-v20090111.jar ...
	[SUCCESSFUL ] org.sonatype.sisu.inject#cglib;2.2.1-v20090111!cglib.jar (384ms)
downloading https://repo1.maven.org/maven2/asm/asm/3.2/asm-3.2.jar ...
	[SUCCESSFUL ] asm#asm;3.2!asm.jar (355ms)
downloading https://repo1.maven.org/maven2/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar ...
	[SUCCESSFUL ] com.sun.xml.bind#jaxb-impl;2.2.3-1!jaxb-impl.jar (367ms)
downloading https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar ...
	[SUCCESSFUL ] org.codehaus.jackson#jackson-core-asl;1.9.13!jackson-core-asl.jar (357ms)
downloading https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar ...
	[SUCCESSFUL ] org.codehaus.jackson#jackson-mapper-asl;1.9.13!jackson-mapper-asl.jar (365ms)
downloading https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar ...
	[SUCCESSFUL ] org.codehaus.jackson#jackson-jaxrs;1.9.13!jackson-jaxrs.jar (356ms)
downloading https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar ...
	[SUCCESSFUL ] org.codehaus.jackson#jackson-xc;1.9.13!jackson-xc.jar (354ms)
downloading https://repo1.maven.org/maven2/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar ...
	[SUCCESSFUL ] javax.xml.stream#stax-api;1.0-2!stax-api.jar (354ms)
downloading https://repo1.maven.org/maven2/javax/activation/activation/1.1/activation-1.1.jar ...
	[SUCCESSFUL ] javax.activation#activation;1.1!activation.jar (354ms)
downloading https://repo1.maven.org/maven2/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar ...
	[SUCCESSFUL ] com.sun.jersey#jersey-server;1.14!jersey-server.jar (362ms)
downloading https://repo1.maven.org/maven2/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar ...
	[SUCCESSFUL ] commons-collections#commons-collections;3.2.2!commons-collections.jar (366ms)
downloading https://repo1.maven.org/maven2/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar ...
	[SUCCESSFUL ] org.mortbay.jetty#jetty;6.1.26!jetty.jar (362ms)
downloading https://repo1.maven.org/maven2/io/airlift/aircompressor/0.8/aircompressor-0.8.jar ...
	[SUCCESSFUL ] io.airlift#aircompressor;0.8!aircompressor.jar (358ms)
downloading https://repo1.maven.org/maven2/io/airlift/slice/0.29/slice-0.29.jar ...
	[SUCCESSFUL ] io.airlift#slice;0.29!slice.jar (355ms)
downloading https://repo1.maven.org/maven2/org/openjdk/jol/jol-core/0.2/jol-core-0.2.jar ...
	[SUCCESSFUL ] org.openjdk.jol#jol-core;0.2!jol-core.jar (357ms)
downloading https://repo1.maven.org/maven2/org/apache/geronimo/specs/geronimo-jta_1.1_spec/1.1.1/geronimo-jta_1.1_spec-1.1.1.jar ...
	[SUCCESSFUL ] org.apache.geronimo.specs#geronimo-jta_1.1_spec;1.1.1!geronimo-jta_1.1_spec.jar (354ms)
downloading https://repo1.maven.org/maven2/javax/mail/mail/1.4.1/mail-1.4.1.jar ...
	[SUCCESSFUL ] javax.mail#mail;1.4.1!mail.jar (361ms)
downloading https://repo1.maven.org/maven2/org/apache/geronimo/specs/geronimo-jaspic_1.0_spec/1.0/geronimo-jaspic_1.0_spec-1.0.jar ...
	[SUCCESSFUL ] org.apache.geronimo.specs#geronimo-jaspic_1.0_spec;1.0!geronimo-jaspic_1.0_spec.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/org/apache/geronimo/specs/geronimo-annotation_1.0_spec/1.1.1/geronimo-annotation_1.0_spec-1.1.1.jar ...
	[SUCCESSFUL ] org.apache.geronimo.specs#geronimo-annotation_1.0_spec;1.1.1!geronimo-annotation_1.0_spec.jar (354ms)
downloading https://repo1.maven.org/maven2/asm/asm-commons/3.1/asm-commons-3.1.jar ...
	[SUCCESSFUL ] asm#asm-commons;3.1!asm-commons.jar (354ms)
downloading https://repo1.maven.org/maven2/asm/asm-tree/3.1/asm-tree-3.1.jar ...
	[SUCCESSFUL ] asm#asm-tree;3.1!asm-tree.jar (356ms)
downloading https://repo1.maven.org/maven2/org/apache/ant/ant-launcher/1.9.1/ant-launcher-1.9.1.jar ...
	[SUCCESSFUL ] org.apache.ant#ant-launcher;1.9.1!ant-launcher.jar (353ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.7.2/hadoop-common-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-common;2.7.2!hadoop-common.jar (424ms)
downloading https://repo1.maven.org/maven2/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar ...
	[SUCCESSFUL ] org.apache.commons#commons-math3;3.1.1!commons-math3.jar (388ms)
downloading https://repo1.maven.org/maven2/xmlenc/xmlenc/0.52/xmlenc-0.52.jar ...
	[SUCCESSFUL ] xmlenc#xmlenc;0.52!xmlenc.jar (354ms)
downloading https://repo1.maven.org/maven2/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar ...
	[SUCCESSFUL ] commons-httpclient#commons-httpclient;3.1!commons-httpclient.jar (363ms)
downloading https://repo1.maven.org/maven2/commons-net/commons-net/3.1/commons-net-3.1.jar ...
	[SUCCESSFUL ] commons-net#commons-net;3.1!commons-net.jar (360ms)
downloading https://repo1.maven.org/maven2/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar ...
	[SUCCESSFUL ] javax.servlet#servlet-api;2.5!servlet-api.jar (360ms)
downloading https://repo1.maven.org/maven2/net/java/dev/jets3t/jets3t/0.9.0/jets3t-0.9.0.jar ...
	[SUCCESSFUL ] net.java.dev.jets3t#jets3t;0.9.0!jets3t.jar (363ms)
downloading https://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar ...
	[SUCCESSFUL ] commons-configuration#commons-configuration;1.6!commons-configuration.jar (366ms)
downloading https://repo1.maven.org/maven2/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar ...
	[SUCCESSFUL ] com.google.code.gson#gson;2.2.4!gson.jar (367ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-auth/2.7.2/hadoop-auth-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-auth;2.7.2!hadoop-auth.jar (360ms)
downloading https://repo1.maven.org/maven2/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar ...
	[SUCCESSFUL ] com.jcraft#jsch;0.1.42!jsch.jar (360ms)
downloading https://repo1.maven.org/maven2/org/apache/htrace/htrace-core/3.1.0-incubating/htrace-core-3.1.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.htrace#htrace-core;3.1.0-incubating!htrace-core.jar (375ms)
downloading https://repo1.maven.org/maven2/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar ...
	[SUCCESSFUL ] com.jamesmurty.utils#java-xmlbuilder;0.4!java-xmlbuilder.jar (354ms)
downloading https://repo1.maven.org/maven2/commons-digester/commons-digester/1.8/commons-digester-1.8.jar ...
	[SUCCESSFUL ] commons-digester#commons-digester;1.8!commons-digester.jar (355ms)
downloading https://repo1.maven.org/maven2/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar ...
	[SUCCESSFUL ] commons-beanutils#commons-beanutils-core;1.8.0!commons-beanutils-core.jar (357ms)
downloading https://repo1.maven.org/maven2/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar ...
	[SUCCESSFUL ] commons-beanutils#commons-beanutils;1.7.0!commons-beanutils.jar (356ms)
downloading https://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar ...
	[SUCCESSFUL ] com.thoughtworks.paranamer#paranamer;2.7!paranamer.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/org/tukaani/xz/1.5/xz-1.5.jar ...
	[SUCCESSFUL ] org.tukaani#xz;1.5!xz.jar (369ms)
downloading https://repo1.maven.org/maven2/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar ...
	[SUCCESSFUL ] org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15!apacheds-kerberos-codec.jar(bundle) (364ms)
downloading https://repo1.maven.org/maven2/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar ...
	[SUCCESSFUL ] org.apache.directory.server#apacheds-i18n;2.0.0-M15!apacheds-i18n.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar ...
	[SUCCESSFUL ] org.apache.directory.api#api-asn1-api;1.0.0-M20!api-asn1-api.jar(bundle) (354ms)
downloading https://repo1.maven.org/maven2/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar ...
	[SUCCESSFUL ] org.apache.directory.api#api-util;1.0.0-M20!api-util.jar(bundle) (355ms)
downloading https://repo1.maven.org/maven2/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar ...
	[SUCCESSFUL ] javax.servlet.jsp#jsp-api;2.1!jsp-api.jar (356ms)
downloading https://repo1.maven.org/maven2/org/slf4j/slf4j-log4j12/1.7.14/slf4j-log4j12-1.7.14.jar ...
	[SUCCESSFUL ] org.slf4j#slf4j-log4j12;1.7.14!slf4j-log4j12.jar (354ms)
downloading https://repo1.maven.org/maven2/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar ...
	[SUCCESSFUL ] tomcat#jasper-compiler;5.5.23!jasper-compiler.jar (360ms)
downloading https://repo1.maven.org/maven2/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar ...
	[SUCCESSFUL ] tomcat#jasper-runtime;5.5.23!jasper-runtime.jar (355ms)
downloading https://repo1.maven.org/maven2/javax/servlet/jsp-api/2.0/jsp-api-2.0.jar ...
	[SUCCESSFUL ] javax.servlet#jsp-api;2.0!jsp-api.jar (355ms)
downloading https://repo1.maven.org/maven2/ant/ant/1.6.5/ant-1.6.5.jar ...
	[SUCCESSFUL ] ant#ant;1.6.5!ant.jar (371ms)
downloading https://repo1.maven.org/maven2/commons-el/commons-el/1.0/commons-el-1.0.jar ...
	[SUCCESSFUL ] commons-el#commons-el;1.0!commons-el.jar (355ms)
downloading https://repo1.maven.org/maven2/org/apache/hbase/hbase-annotations/1.1.1/hbase-annotations-1.1.1.jar ...
	[SUCCESSFUL ] org.apache.hbase#hbase-annotations;1.1.1!hbase-annotations.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/hbase/hbase-protocol/1.1.1/hbase-protocol-1.1.1.jar ...
	[SUCCESSFUL ] org.apache.hbase#hbase-protocol;1.1.1!hbase-protocol.jar (414ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-all/4.0.52.Final/netty-all-4.0.52.Final.jar ...
	[SUCCESSFUL ] io.netty#netty-all;4.0.52.Final!netty-all.jar (400ms)
downloading https://repo1.maven.org/maven2/org/jruby/jcodings/jcodings/1.0.8/jcodings-1.0.8.jar ...
	[SUCCESSFUL ] org.jruby.jcodings#jcodings;1.0.8!jcodings.jar (419ms)
downloading https://repo1.maven.org/maven2/org/jruby/joni/joni/2.1.2/joni-2.1.2.jar ...
	[SUCCESSFUL ] org.jruby.joni#joni;2.1.2!joni.jar (369ms)
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-core/2.7.2/hadoop-mapreduce-client-core-2.7.2.jar ...
	[SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-core;2.7.2!hadoop-mapreduce-client-core.jar (393ms)
downloading https://repo1.maven.org/maven2/com/github/stephenc/findbugs/findbugs-annotations/1.3.9-1/findbugs-annotations-1.3.9-1.jar ...
	[SUCCESSFUL ] com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1!findbugs-annotations.jar (354ms)
downloading https://repo1.maven.org/maven2/junit/junit/4.11/junit-4.11.jar ...
	[SUCCESSFUL ] junit#junit;4.11!junit.jar (357ms)
downloading https://repo1.maven.org/maven2/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar ...
	[SUCCESSFUL ] org.hamcrest#hamcrest-core;1.3!hamcrest-core.jar (353ms)
downloading https://repo1.maven.org/maven2/javax/transaction/jta/1.1/jta-1.1.jar ...
	[SUCCESSFUL ] javax.transaction#jta;1.1!jta.jar (354ms)
downloading https://repo1.maven.org/maven2/javax/transaction/transaction-api/1.1/transaction-api-1.1.jar ...
	[SUCCESSFUL ] javax.transaction#transaction-api;1.1!transaction-api.jar (368ms)
downloading https://repo1.maven.org/maven2/com/google/inject/extensions/guice-assistedinject/3.0/guice-assistedinject-3.0.jar ...
	[SUCCESSFUL ] com.google.inject.extensions#guice-assistedinject;3.0!guice-assistedinject.jar (355ms)
downloading https://repo1.maven.org/maven2/it/unimi/dsi/fastutil/6.5.6/fastutil-6.5.6.jar ...
	[SUCCESSFUL ] it.unimi.dsi#fastutil;6.5.6!fastutil.jar (894ms)
downloading https://repo1.maven.org/maven2/org/apache/twill/twill-common/0.6.0-incubating/twill-common-0.6.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.twill#twill-common;0.6.0-incubating!twill-common.jar (355ms)
downloading https://repo1.maven.org/maven2/org/apache/twill/twill-core/0.6.0-incubating/twill-core-0.6.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.twill#twill-core;0.6.0-incubating!twill-core.jar (358ms)
downloading https://repo1.maven.org/maven2/org/apache/twill/twill-discovery-api/0.6.0-incubating/twill-discovery-api-0.6.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.twill#twill-discovery-api;0.6.0-incubating!twill-discovery-api.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/twill/twill-discovery-core/0.6.0-incubating/twill-discovery-core-0.6.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.twill#twill-discovery-core;0.6.0-incubating!twill-discovery-core.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/twill/twill-zookeeper/0.6.0-incubating/twill-zookeeper-0.6.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.twill#twill-zookeeper;0.6.0-incubating!twill-zookeeper.jar (357ms)
downloading https://repo1.maven.org/maven2/org/apache/twill/twill-api/0.6.0-incubating/twill-api-0.6.0-incubating.jar ...
	[SUCCESSFUL ] org.apache.twill#twill-api;0.6.0-incubating!twill-api.jar (354ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-llap-tez/2.3.9/hive-llap-tez-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-llap-tez;2.3.9!hive-llap-tez.jar (357ms)
downloading https://repo1.maven.org/maven2/org/antlr/ST4/4.0.4/ST4-4.0.4.jar ...
	[SUCCESSFUL ] org.antlr#ST4;4.0.4!ST4.jar (356ms)
downloading https://repo1.maven.org/maven2/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar ...
	[SUCCESSFUL ] org.apache.ivy#ivy;2.4.0!ivy.jar (372ms)
downloading https://repo1.maven.org/maven2/org/codehaus/groovy/groovy-all/2.4.4/groovy-all-2.4.4.jar ...
	[SUCCESSFUL ] org.codehaus.groovy#groovy-all;2.4.4!groovy-all.jar (549ms)
downloading https://repo1.maven.org/maven2/stax/stax-api/1.0.1/stax-api-1.0.1.jar ...
	[SUCCESSFUL ] stax#stax-api;1.0.1!stax-api.jar (354ms)
downloading https://repo1.maven.org/maven2/net/hydromatic/eigenbase-properties/1.1.5/eigenbase-properties-1.1.5.jar ...
	[SUCCESSFUL ] net.hydromatic#eigenbase-properties;1.1.5!eigenbase-properties.jar(bundle) (356ms)
downloading https://repo1.maven.org/maven2/org/codehaus/janino/commons-compiler/2.7.6/commons-compiler-2.7.6.jar ...
	[SUCCESSFUL ] org.codehaus.janino#commons-compiler;2.7.6!commons-compiler.jar (354ms)
downloading https://repo1.maven.org/maven2/org/codehaus/janino/janino/2.7.6/janino-2.7.6.jar ...
	[SUCCESSFUL ] org.codehaus.janino#janino;2.7.6!janino.jar (363ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-llap-client/2.3.9/hive-llap-client-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-llap-client;2.3.9!hive-llap-client.jar (356ms)
downloading https://repo1.maven.org/maven2/org/apache/hive/hive-llap-common/2.3.9/hive-llap-common-2.3.9.jar ...
	[SUCCESSFUL ] org.apache.hive#hive-llap-common;2.3.9!hive-llap-common.jar (358ms)
downloading https://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/1.1.8.2/snappy-java-1.1.8.2.jar ...
	[SUCCESSFUL ] org.xerial.snappy#snappy-java;1.1.8.2!snappy-java.jar(bundle) (383ms)
downloading https://repo1.maven.org/maven2/org/slf4j/slf4j-api/1.7.36/slf4j-api-1.7.36.jar ...
	[SUCCESSFUL ] org.slf4j#slf4j-api;1.7.36!slf4j-api.jar (353ms)
downloading https://repo1.maven.org/maven2/com/google/code/findbugs/jsr305/3.0.2/jsr305-3.0.2.jar ...
	[SUCCESSFUL ] com.google.code.findbugs#jsr305;3.0.2!jsr305.jar (353ms)
:: resolution report :: resolve 193214ms :: artifacts dl 63795ms
	:: modules in use:
	ant#ant;1.6.5 from central in [default]
	aopalliance#aopalliance;1.0 from central in [default]
	asm#asm;3.2 from central in [default]
	asm#asm-commons;3.1 from central in [default]
	asm#asm-tree;3.1 from central in [default]
	co.cask.tephra#tephra-api;0.6.0 from central in [default]
	co.cask.tephra#tephra-core;0.6.0 from central in [default]
	co.cask.tephra#tephra-hbase-compat-1.0;0.6.0 from central in [default]
	com.github.joshelser#dropwizard-metrics-hadoop-metrics2-reporter;0.1.2 from central in [default]
	com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1 from central in [default]
	com.google.code.findbugs#jsr305;3.0.2 from central in [default]
	com.google.code.gson#gson;2.2.4 from central in [default]
	com.google.guava#guava;14.0.1 from central in [default]
	com.google.inject#guice;3.0 from central in [default]
	com.google.inject.extensions#guice-assistedinject;3.0 from central in [default]
	com.google.inject.extensions#guice-servlet;3.0 from central in [default]
	com.google.protobuf#protobuf-java;2.5.0 from central in [default]
	com.jamesmurty.utils#java-xmlbuilder;0.4 from central in [default]
	com.jcraft#jsch;0.1.42 from central in [default]
	com.jolbox#bonecp;0.8.0.RELEASE from central in [default]
	com.sun.jersey#jersey-client;1.9 from central in [default]
	com.sun.jersey#jersey-core;1.14 from central in [default]
	com.sun.jersey#jersey-json;1.14 from central in [default]
	com.sun.jersey#jersey-server;1.14 from central in [default]
	com.sun.jersey.contribs#jersey-guice;1.9 from central in [default]
	com.sun.xml.bind#jaxb-impl;2.2.3-1 from central in [default]
	com.tdunning#json;1.8 from central in [default]
	com.thoughtworks.paranamer#paranamer;2.7 from central in [default]
	com.zaxxer#HikariCP;2.5.1 from central in [default]
	commons-beanutils#commons-beanutils;1.7.0 from central in [default]
	commons-beanutils#commons-beanutils-core;1.8.0 from central in [default]
	commons-cli#commons-cli;1.2 from central in [default]
	commons-codec#commons-codec;1.4 from central in [default]
	commons-collections#commons-collections;3.2.2 from central in [default]
	commons-configuration#commons-configuration;1.6 from central in [default]
	commons-dbcp#commons-dbcp;1.4 from central in [default]
	commons-digester#commons-digester;1.8 from central in [default]
	commons-el#commons-el;1.0 from central in [default]
	commons-httpclient#commons-httpclient;3.1 from central in [default]
	commons-io#commons-io;2.4 from central in [default]
	commons-lang#commons-lang;2.6 from central in [default]
	commons-logging#commons-logging;1.2 from central in [default]
	commons-net#commons-net;3.1 from central in [default]
	commons-pool#commons-pool;1.5.4 from central in [default]
	io.airlift#aircompressor;0.8 from central in [default]
	io.airlift#slice;0.29 from central in [default]
	io.dropwizard.metrics#metrics-core;3.1.0 from central in [default]
	io.dropwizard.metrics#metrics-json;3.1.0 from central in [default]
	io.dropwizard.metrics#metrics-jvm;3.1.0 from central in [default]
	io.netty#netty;3.7.0.Final from central in [default]
	io.netty#netty-all;4.0.52.Final from central in [default]
	it.unimi.dsi#fastutil;6.5.6 from central in [default]
	javax.activation#activation;1.1 from central in [default]
	javax.inject#javax.inject;1 from central in [default]
	javax.jdo#jdo-api;3.0.1 from central in [default]
	javax.mail#mail;1.4.1 from central in [default]
	javax.servlet#jsp-api;2.0 from central in [default]
	javax.servlet#servlet-api;2.5 from central in [default]
	javax.servlet.jsp#jsp-api;2.1 from central in [default]
	javax.transaction#jta;1.1 from central in [default]
	javax.transaction#transaction-api;1.1 from central in [default]
	javax.xml.bind#jaxb-api;2.2.2 from central in [default]
	javax.xml.stream#stax-api;1.0-2 from central in [default]
	javolution#javolution;5.5.1 from central in [default]
	jline#jline;2.12 from central in [default]
	joda-time#joda-time;2.8.1 from central in [default]
	junit#junit;4.11 from central in [default]
	log4j#log4j;1.2.17 from central in [default]
	net.hydromatic#eigenbase-properties;1.1.5 from central in [default]
	net.java.dev.jets3t#jets3t;0.9.0 from central in [default]
	net.sf.opencsv#opencsv;2.3 from central in [default]
	org.antlr#ST4;4.0.4 from central in [default]
	org.antlr#antlr-runtime;3.5.2 from central in [default]
	org.apache.ant#ant;1.9.1 from central in [default]
	org.apache.ant#ant-launcher;1.9.1 from central in [default]
	org.apache.avro#avro;1.8.2 from central in [default]
	org.apache.commons#commons-compress;1.9 from central in [default]
	org.apache.commons#commons-lang3;3.1 from central in [default]
	org.apache.commons#commons-math3;3.1.1 from central in [default]
	org.apache.derby#derby;10.10.2.0 from central in [default]
	org.apache.directory.api#api-asn1-api;1.0.0-M20 from central in [default]
	org.apache.directory.api#api-util;1.0.0-M20 from central in [default]
	org.apache.directory.server#apacheds-i18n;2.0.0-M15 from central in [default]
	org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 from central in [default]
	org.apache.geronimo.specs#geronimo-annotation_1.0_spec;1.1.1 from central in [default]
	org.apache.geronimo.specs#geronimo-jaspic_1.0_spec;1.0 from central in [default]
	org.apache.geronimo.specs#geronimo-jta_1.1_spec;1.1.1 from central in [default]
	org.apache.hadoop#hadoop-annotations;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-auth;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-client-api;3.3.6 from central in [default]
	org.apache.hadoop#hadoop-client-runtime;3.3.6 from central in [default]
	org.apache.hadoop#hadoop-common;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-mapreduce-client-core;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-yarn-api;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-yarn-common;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-yarn-server-common;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.7.2 from central in [default]
	org.apache.hadoop#hadoop-yarn-server-web-proxy;2.7.2 from central in [default]
	org.apache.hbase#hbase-annotations;1.1.1 from central in [default]
	org.apache.hbase#hbase-client;1.1.1 from central in [default]
	org.apache.hbase#hbase-protocol;1.1.1 from central in [default]
	org.apache.hive#hive-common;2.3.9 from central in [default]
	org.apache.hive#hive-exec;2.3.9 from central in [default]
	org.apache.hive#hive-llap-client;2.3.9 from central in [default]
	org.apache.hive#hive-llap-common;2.3.9 from central in [default]
	org.apache.hive#hive-llap-tez;2.3.9 from central in [default]
	org.apache.hive#hive-metastore;2.3.9 from central in [default]
	org.apache.hive#hive-serde;2.3.9 from central in [default]
	org.apache.hive#hive-service-rpc;2.3.9 from central in [default]
	org.apache.hive#hive-shims;2.3.9 from central in [default]
	org.apache.hive#hive-storage-api;2.4.0 from central in [default]
	org.apache.hive.shims#hive-shims-0.23;2.3.9 from central in [default]
	org.apache.hive.shims#hive-shims-common;2.3.9 from central in [default]
	org.apache.hive.shims#hive-shims-scheduler;2.3.9 from central in [default]
	org.apache.htrace#htrace-core;3.1.0-incubating from central in [default]
	org.apache.httpcomponents#httpclient;4.4 from central in [default]
	org.apache.httpcomponents#httpcore;4.4 from central in [default]
	org.apache.ivy#ivy;2.4.0 from central in [default]
	org.apache.logging.log4j#log4j-1.2-api;2.6.2 from central in [default]
	org.apache.logging.log4j#log4j-slf4j-impl;2.6.2 from central in [default]
	org.apache.logging.log4j#log4j-web;2.6.2 from central in [default]
	org.apache.orc#orc-core;1.3.4 from central in [default]
	org.apache.parquet#parquet-hadoop-bundle;1.8.1 from central in [default]
	org.apache.thrift#libfb303;0.9.3 from central in [default]
	org.apache.thrift#libthrift;0.9.3 from central in [default]
	org.apache.twill#twill-api;0.6.0-incubating from central in [default]
	org.apache.twill#twill-common;0.6.0-incubating from central in [default]
	org.apache.twill#twill-core;0.6.0-incubating from central in [default]
	org.apache.twill#twill-discovery-api;0.6.0-incubating from central in [default]
	org.apache.twill#twill-discovery-core;0.6.0-incubating from central in [default]
	org.apache.twill#twill-zookeeper;0.6.0-incubating from central in [default]
	org.apache.zookeeper#zookeeper;3.4.6 from central in [default]
	org.codehaus.groovy#groovy-all;2.4.4 from central in [default]
	org.codehaus.jackson#jackson-core-asl;1.9.13 from central in [default]
	org.codehaus.jackson#jackson-jaxrs;1.9.13 from central in [default]
	org.codehaus.jackson#jackson-mapper-asl;1.9.13 from central in [default]
	org.codehaus.jackson#jackson-xc;1.9.13 from central in [default]
	org.codehaus.janino#commons-compiler;2.7.6 from central in [default]
	org.codehaus.janino#janino;2.7.6 from central in [default]
	org.codehaus.jettison#jettison;1.1 from central in [default]
	org.datanucleus#datanucleus-api-jdo;4.2.4 from central in [default]
	org.datanucleus#datanucleus-core;4.1.17 from central in [default]
	org.datanucleus#datanucleus-rdbms;4.1.19 from central in [default]
	org.datanucleus#javax.jdo;3.2.0-m3 from central in [default]
	org.eclipse.jetty.aggregate#jetty-all;7.6.0.v20120127 from central in [default]
	org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016 from central in [default]
	org.fusesource.leveldbjni#leveldbjni-all;1.8 from central in [default]
	org.hamcrest#hamcrest-core;1.3 from central in [default]
	org.jruby.jcodings#jcodings;1.0.8 from central in [default]
	org.jruby.joni#joni;2.1.2 from central in [default]
	org.mortbay.jetty#jetty;6.1.26 from central in [default]
	org.mortbay.jetty#jetty-util;6.1.26 from central in [default]
	org.openjdk.jol#jol-core;0.2 from central in [default]
	org.slf4j#slf4j-api;1.7.36 from central in [default]
	org.slf4j#slf4j-log4j12;1.7.14 from central in [default]
	org.sonatype.sisu.inject#cglib;2.2.1-v20090111 from central in [default]
	org.tukaani#xz;1.5 from central in [default]
	org.xerial.snappy#snappy-java;1.1.8.2 from central in [default]
	stax#stax-api;1.0.1 from central in [default]
	tomcat#jasper-compiler;5.5.23 from central in [default]
	tomcat#jasper-runtime;5.5.23 from central in [default]
	xmlenc#xmlenc;0.52 from central in [default]
	:: evicted modules:
	org.slf4j#slf4j-api;1.7.10 by [org.slf4j#slf4j-api;1.7.36] in [default]
	org.slf4j#slf4j-log4j12;1.6.1 by [org.slf4j#slf4j-log4j12;1.7.14] in [default]
	log4j#log4j;1.2.16 by [log4j#log4j;1.2.17] in [default]
	commons-logging#commons-logging;1.1.3 by [commons-logging#commons-logging;1.2] in [default]
	asm#asm;3.1 by [asm#asm;3.2] in [default]
	io.dropwizard.metrics#metrics-core;3.1.2 by [io.dropwizard.metrics#metrics-core;3.1.0] in [default]
	org.xerial.snappy#snappy-java;1.1.1.3 by [org.xerial.snappy#snappy-java;1.1.8.2] in [default]
	com.google.code.findbugs#jsr305;3.0.0 by [com.google.code.findbugs#jsr305;3.0.2] in [default]
	javax.servlet#servlet-api;2.4 by [javax.servlet#servlet-api;2.5] in [default]
	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.2] in [default]
	org.apache.hadoop#hadoop-auth;2.5.1 by [org.apache.hadoop#hadoop-auth;2.7.2] in [default]
	io.netty#netty;3.6.2.Final by [io.netty#netty;3.7.0.Final] in [default]
	com.google.code.findbugs#jsr305;2.0.1 by [com.google.code.findbugs#jsr305;3.0.0] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |  176  |  168  |  168  |   13  ||  164  |  164  |
	---------------------------------------------------------------------

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent-d0d2962d-ae27-4526-a0c7-040a542e1e54
	confs: [default]
	164 artifacts copied, 0 already retrieved (195815kB/314ms)
+--------------------+
|           namespace|
+--------------------+
|                  a1|
|              a1_dev|
|                  a2|
|              access|
|        access_xzx01|
|        access_xzx02|
|            activity|
|                 ...|
+--------------------+
only showing top 20 rows

scala>
```
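
The resolution report above shows that the unshaded `com.google.guava#guava;14.0.1` is still pulled in transitively by the Hive 2.3.9 client artifacts. As a quick sanity check (a minimal sketch, not part of this change), you can ask the JVM which Guava jar the running session actually loaded; which jar wins depends on how the Ivy-resolved jars are ordered against the Guava already bundled in the Spark distribution:

```scala
// Minimal sketch (not from this PR): print the location of the Guava jar the
// current spark-shell / driver JVM loaded. ImmutableList exists in every Guava
// release, so the check works on 14.0.1 as well as 33.x.
val guavaClass  = classOf[com.google.common.collect.ImmutableList[_]]
val guavaSource = guavaClass.getProtectionDomain.getCodeSource.getLocation
println(s"Guava loaded from: $guavaSource")
```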

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #42599 from pan3793/unshare-guava.

Authored-by: Cheng Pan <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun pushed a commit that referenced this pull request Sep 12, 2024
### What changes were proposed in this pull request?

This PR upgrades Spark's built-in Guava from 14 to 33.2.1-jre.

Currently, Spark uses Guava 14 because the previously built-in Hive 2.3.9 is incompatible with newer Guava versions. HIVE-27560 (apache/hive#4542) makes Hive 2.3.10 compatible with Guava 14 and newer (thanks to LuciferYang).

### Why are the changes needed?

It's a long-standing issue; see prior discussions in #35584, #36231, and #33989.

### Does this PR introduce _any_ user-facing change?

Yes, some user-facing error messages changed.

### How was this patch tested?

GA passed.

Closes #42493 from pan3793/guava.

Authored-by: Cheng Pan <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
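
For background on why the built-in Hive jars blocked this upgrade for so long: the failure mode is ordinary Guava API churn. As an illustration only (a generic example, not the specific change behind HIVE-27560), `Objects.toStringHelper` existed in Guava 14 but was removed in Guava 21, so code compiled against it fails with `NoSuchMethodError` on 33.2.1-jre; `MoreObjects.toStringHelper` is the replacement that works on modern Guava:

```scala
// Illustrative sketch: the Guava-14-era call style that no longer links on
// modern Guava, and the MoreObjects replacement that does. TableRef and
// describe are made-up names, not Spark or Hive APIs.
import com.google.common.base.MoreObjects

final case class TableRef(db: String, table: String)

def describe(ref: TableRef): String =
  MoreObjects.toStringHelper(ref) // Objects.toStringHelper was removed in Guava 21
    .add("db", ref.db)
    .add("table", ref.table)
    .toString
```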
attilapiros pushed a commit to attilapiros/spark that referenced this pull request Oct 4, 2024
himadripal pushed a commit to himadripal/spark that referenced this pull request Oct 19, 2024
prabhjyotsingh pushed a commit to acceldata-io/spark3 that referenced this pull request Feb 8, 2025
prabhjyotsingh pushed a commit to acceldata-io/spark3 that referenced this pull request Feb 8, 2025
prabhjyotsingh pushed a commit to acceldata-io/spark3 that referenced this pull request Feb 8, 2025
prabhjyotsingh pushed a commit to acceldata-io/spark3 that referenced this pull request Feb 8, 2025
shubhluck pushed a commit to acceldata-io/spark3 that referenced this pull request May 16, 2025
senthh pushed a commit to acceldata-io/spark3 that referenced this pull request May 26, 2025
shubhluck pushed a commit to acceldata-io/spark3 that referenced this pull request Sep 3, 2025