Commit 15000a2

Rename.
1 parent 021134c commit 15000a2

3 files changed: +6 -5 lines changed

docs/sql-migration-guide-upgrade.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ displayTitle: Spark SQL Upgrading Guide
 
   - In Spark version 2.4 and earlier, users can create a map with duplicated keys via built-in functions like `CreateMap`, `StringToMap`, etc. The behavior of a map with duplicated keys is undefined, e.g. map lookup respects the duplicated key that appears first, `Dataset.collect` only keeps the duplicated key that appears last, `MapKeys` returns duplicated keys, etc. Since Spark 3.0, these built-in functions will remove duplicated map keys with a last-wins policy. Users may still read map values with duplicated keys from data sources which do not enforce it (e.g. Parquet); the behavior is undefined.
 
-  - In Spark version 2.4 and earlier, the `SET` command works without any warnings even if the specified key is for `SparkConf` entries and it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a `SparkConf` key is used. You can disable such a check by setting `spark.sql.legacy.setCommandRejectsSparkConfs` to `false`.
+  - In Spark version 2.4 and earlier, the `SET` command works without any warnings even if the specified key is for `SparkConf` entries and it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a `SparkConf` key is used. You can disable such a check by setting `spark.sql.legacy.setCommandRejectsSparkCoreConfs` to `false`.
 
   - Spark applications which are built with Spark version 2.4 and prior, and call methods of `UserDefinedFunction`, need to be re-compiled with Spark 3.0, as they are not binary compatible with Spark 3.0.
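
The last-wins de-duplication the first migration note describes can be seen directly in SQL. A minimal sketch, assuming a Spark 3.0 session bound to a variable named `spark`:

// Minimal sketch of the last-wins policy described in the migration note,
// assuming a Spark 3.0 session `spark`. map(k0, v0, k1, v1, ...) is the
// built-in map constructor backed by CreateMap.
spark.sql("SELECT map(1, 'a', 1, 'b') AS m").show(truncate = false)
// Spark 3.0: {1 -> b}   (the last duplicate key wins)
// Spark 2.4: undefined, e.g. lookups could see 'a' while collect() kept 'b'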

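The rename is behavior-preserving, so the check works as the second note describes. A hedged end-to-end sketch, assuming the same session and that `spark.executor.cores` is registered as a core `SparkConf` entry:

import org.apache.spark.sql.AnalysisException

// Rejected since 3.0: the key belongs to SparkConf, which SET cannot update.
try {
  spark.sql("SET spark.executor.cores=4")
} catch {
  case e: AnalysisException => println(e.getMessage)
  // "Cannot modify the value of a Spark config: spark.executor.cores"
}

// Opt back into the silent 2.4 behavior:
spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", "false")
spark.sql("SET spark.executor.cores=4")  // accepted again, but still has no effect
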
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

Lines changed: 4 additions & 3 deletions
@@ -1611,8 +1611,8 @@ object SQLConf {
     .intConf
     .createWithDefault(25)
 
-  val SET_COMMAND_REJECTS_SPARK_CONFS =
-    buildConf("spark.sql.legacy.setCommandRejectsSparkConfs")
+  val SET_COMMAND_REJECTS_SPARK_CORE_CONFS =
+    buildConf("spark.sql.legacy.setCommandRejectsSparkCoreConfs")
       .internal()
       .doc("If it is set to true, SET command will fail when the key is registered as " +
         "a SparkConf entry.")
@@ -2045,7 +2045,8 @@ class SQLConf extends Serializable with Logging {
 
   def maxToStringFields: Int = getConf(SQLConf.MAX_TO_STRING_FIELDS)
 
-  def setCommandRejectsSparkConfs: Boolean = getConf(SQLConf.SET_COMMAND_REJECTS_SPARK_CONFS)
+  def setCommandRejectsSparkCoreConfs: Boolean =
+    getConf(SQLConf.SET_COMMAND_REJECTS_SPARK_CORE_CONFS)
 
   def legacyTimeParserEnabled: Boolean = getConf(SQLConf.LEGACY_TIME_PARSER_ENABLED)
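
The first hunk cuts the declaration off after `.doc(...)`. For readers unfamiliar with the `ConfigBuilder` fluent API, a plausible completion is sketched below; the `.booleanConf.createWithDefault(true)` tail is an assumption inferred from the `Boolean` accessor and the reject-by-default behavior since 3.0, not something shown in this diff:

// Hedged completion of the renamed entry, assuming it sits inside
// `object SQLConf`. The tail after .doc(...) is an assumption: the hunk
// truncates there, so .booleanConf / .createWithDefault(true) are inferred
// from the Boolean accessor setCommandRejectsSparkCoreConfs.
val SET_COMMAND_REJECTS_SPARK_CORE_CONFS =
  buildConf("spark.sql.legacy.setCommandRejectsSparkCoreConfs")
    .internal()
    .doc("If it is set to true, SET command will fail when the key is registered as " +
      "a SparkConf entry.")
    .booleanConf                  // assumed: accessor returns Boolean
    .createWithDefault(true)      // assumed: rejection is the 3.0 default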

sql/core/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala

Lines changed: 1 addition & 1 deletion
@@ -153,7 +153,7 @@ class RuntimeConfig private[sql](sqlConf: SQLConf = new SQLConf) {
     if (SQLConf.staticConfKeys.contains(key)) {
       throw new AnalysisException(s"Cannot modify the value of a static config: $key")
     }
-    if (sqlConf.setCommandRejectsSparkConfs &&
+    if (sqlConf.setCommandRejectsSparkCoreConfs &&
         ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)) {
       throw new AnalysisException(s"Cannot modify the value of a Spark config: $key")
     }
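
The renamed guard rejects a key only when `ConfigEntry.findEntry(key)` finds a registered entry and the key is absent from `SQLConf.sqlConfEntries`, i.e. it is a core Spark config rather than a SQL one. A sketch of what that means at the `RuntimeConfig` surface, assuming `spark.driver.memory` is registered as a core entry:

// SQL conf: present in SQLConf.sqlConfEntries, so the guard lets it through.
spark.conf.set("spark.sql.shuffle.partitions", "400")

// Core conf (assumed registered via ConfigEntry): the guard throws
// AnalysisException("Cannot modify the value of a Spark config: spark.driver.memory").
// spark.conf.set("spark.driver.memory", "4g")

// Arbitrary unregistered key: findEntry returns null, so it is still allowed.
spark.conf.set("my.app.custom.flag", "on")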
