
Commit f719ccc

zjffdu authored and Felix Cheung committed
[SPARK-19572][SPARKR] Allow to disable hive in sparkR shell
## What changes were proposed in this pull request?

SPARK-15236 did this for the Scala shell; this ticket does the same for the sparkR shell. This benefits not only sparkR itself but also downstream projects such as Livy, which use shell.R for their interactive sessions. Until now, Livy had no control over whether Hive is enabled.

## How was this patch tested?

Tested manually: ran `bin/sparkR --master local --conf spark.sql.catalogImplementation=in-memory` and verified that Hive is not enabled.

Author: Jeff Zhang <[email protected]>

Closes #16907 from zjffdu/SPARK-19572.

(cherry picked from commit 7315880)

Signed-off-by: Felix Cheung <[email protected]>
1 parent d887f75 commit f719ccc

File tree

1 file changed (+4, −2 lines)


sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala

Lines changed: 4 additions & 2 deletions

```diff
@@ -47,12 +47,14 @@ private[sql] object SQLUtils extends Logging {
       jsc: JavaSparkContext,
       sparkConfigMap: JMap[Object, Object],
       enableHiveSupport: Boolean): SparkSession = {
-    val spark = if (SparkSession.hiveClassesArePresent && enableHiveSupport) {
+    val spark = if (SparkSession.hiveClassesArePresent && enableHiveSupport
+        && jsc.sc.conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase == "hive") {
       SparkSession.builder().sparkContext(withHiveExternalCatalog(jsc.sc)).getOrCreate()
     } else {
       if (enableHiveSupport) {
         logWarning("SparkR: enableHiveSupport is requested for SparkSession but " +
-          "Spark is not built with Hive; falling back to without Hive support.")
+          s"Spark is not built with Hive or ${CATALOG_IMPLEMENTATION.key} is not set to 'hive', " +
+          "falling back to without Hive support.")
       }
       SparkSession.builder().sparkContext(jsc.sc).getOrCreate()
     }
```
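The decision the patch makes can be sketched in isolation: Hive is used only when the Hive classes are on the classpath, the caller asked for Hive support, *and* `spark.sql.catalogImplementation` is (or defaults to) `hive`. The snippet below is a minimal stand-in using a hypothetical `chooseCatalog` helper, not the real `SparkSession` builder API:

```scala
// Minimal sketch of the catalog-selection condition added by SPARK-19572.
// `chooseCatalog` is a hypothetical helper; the real code builds a
// SparkSession with or without the Hive external catalog.
object CatalogChoice {
  def chooseCatalog(hiveClassesPresent: Boolean,
                    enableHiveSupport: Boolean,
                    catalogImpl: String): String = {
    // All three conditions must hold for Hive to be enabled;
    // otherwise fall back to the in-memory catalog.
    if (hiveClassesPresent && enableHiveSupport &&
        catalogImpl.toLowerCase == "hive") "hive"
    else "in-memory"
  }

  def main(args: Array[String]): Unit = {
    // Hive classes present, support requested, default config: Hive wins.
    println(chooseCatalog(true, true, "hive"))
    // Passing --conf spark.sql.catalogImplementation=in-memory now
    // disables Hive even when support was requested.
    println(chooseCatalog(true, true, "in-memory"))
  }
}
```

This is the behavioral change the livy use case relies on: before this patch, the config value was never consulted, so a sparkR shell could not opt out of Hive at launch time.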

0 commit comments
