This repository was archived by the owner on May 9, 2024. It is now read-only.

Commit 0398f5b

First populate the SQLConf and then construct executionHive and metadataHive.
1 parent 9126ea4 commit 0398f5b

File tree

1 file changed: +22 −3 lines changed


sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala

Lines changed: 22 additions & 3 deletions
@@ -182,9 +182,28 @@ class SQLContext(@transient val sparkContext: SparkContext)
     conf.dialect
   }
 
-  sparkContext.getConf.getAll.foreach {
-    case (key, value) if key.startsWith("spark.sql") => setConf(key, value)
-    case _ =>
+  {
+    // We extract spark sql settings from SparkContext's conf and put them to
+    // Spark SQL's conf.
+    // First, we populate the SQLConf (conf). So, we can make sure that other values using
+    // those settings in their construction can get the correct settings.
+    // For example, metadataHive in HiveContext may need both spark.sql.hive.metastore.version
+    // and spark.sql.hive.metastore.jars to get correctly constructed.
+    val properties = new Properties
+    sparkContext.getConf.getAll.foreach {
+      case (key, value) if key.startsWith("spark.sql") => properties.setProperty(key, value)
+      case _ =>
+    }
+    // We directly put those settings to conf to avoid of calling setConf, which may have
+    // side-effects. For example, in HiveContext, setConf may cause executionHive and metadataHive
+    // get constructed. If we call setConf directly, the constructed metadataHive may have
+    // wrong settings, or the construction may fail.
+    conf.setConf(properties)
+    // After we have populated SQLConf, we call setConf to populate other confs in the subclass
+    // (e.g. hiveconf in HiveContext).
+    properties.foreach {
+      case (key, value) => setConf(key, value)
+    }
   }
 
   @transient
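The ordering problem this commit fixes can be sketched outside of Spark: when a subclass's setter has a side effect that eagerly constructs an object from the *currently visible* settings, calling that setter entry-by-entry can construct the object before all relevant settings have arrived. Bulk-populating the plain conf first, then invoking the setter for its side effects, guarantees the constructed object sees the complete configuration. The sketch below uses hypothetical stand-in classes (`Conf` for SQLConf, `Ctx` for HiveContext, `metastore.version` as a placeholder key); it is not Spark's actual API.

```scala
import java.util.Properties
import scala.collection.JavaConverters._

// Hypothetical stand-in for SQLConf: a plain settings map with a
// side-effect-free bulk populate.
class Conf {
  private val settings = new java.util.concurrent.ConcurrentHashMap[String, String]()
  def set(key: String, value: String): Unit = settings.put(key, value)
  // Bulk-populate directly, bypassing any subclass hooks.
  def setConf(props: Properties): Unit =
    props.stringPropertyNames.asScala.foreach(k => settings.put(k, props.getProperty(k)))
  def get(key: String, default: String): String =
    Option(settings.get(key)).getOrElse(default)
}

// Hypothetical stand-in for HiveContext: its setConf has a side effect
// that constructs a "client" from whatever settings are visible now.
class Ctx(initial: Map[String, String]) {
  val conf = new Conf
  private var client: Option[String] = None

  // Like HiveContext.setConf: the first call lazily constructs the client,
  // snapshotting the settings visible at that moment.
  def setConf(key: String, value: String): Unit = {
    conf.set(key, value)
    if (client.isEmpty) {
      client = Some(conf.get("metastore.version", "builtin"))
    }
  }

  // The fix: step 1 bulk-populates conf with no side effects; step 2 runs
  // setConf so subclass hooks fire against the fully populated conf.
  def init(): Unit = {
    val props = new Properties
    initial.foreach { case (k, v) => props.setProperty(k, v) }
    conf.setConf(props)                               // step 1: no side effects
    initial.foreach { case (k, v) => setConf(k, v) }  // step 2: hooks see full conf
  }

  def clientVersion: String = client.getOrElse("<none>")
}
```

Had `init` only looped over `setConf`, the client could be constructed on the first entry, before `metastore.version` was set, and snapshot the wrong default; with the bulk populate first, the client is always built from the complete configuration.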

0 commit comments