
Commit dd49ea3

Add the legacy prefix.
1 parent 96b2280 commit dd49ea3

File tree

2 files changed: +2 −2 lines changed


docs/sql-migration-guide-upgrade.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ displayTitle: Spark SQL Upgrading Guide
 
 - In Spark version 2.4 and earlier, users can create a map with duplicated keys via built-in functions like `CreateMap`, `StringToMap`, etc. The behavior of a map with duplicated keys is undefined: e.g. map lookup respects the duplicated key that appears first, `Dataset.collect` keeps only the duplicated key that appears last, `MapKeys` returns duplicated keys, etc. Since Spark 3.0, these built-in functions remove duplicated map keys with a last-wins policy. Users may still read map values with duplicated keys from data sources which do not enforce it (e.g. Parquet); the behavior will be undefined.
 
-- In Spark version 2.4 and earlier, the `SET` command works without any warning even if the specified key is for a `SparkConf` entry; it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a `SparkConf` key is used. You can disable this check by setting `spark.sql.execution.setCommandRejectsSparkConfs` to `false`.
+- In Spark version 2.4 and earlier, the `SET` command works without any warning even if the specified key is for a `SparkConf` entry; it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a `SparkConf` key is used. You can disable this check by setting `spark.sql.legacy.execution.setCommandRejectsSparkConfs` to `false`.
 
 ## Upgrading From Spark SQL 2.3 to 2.4
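
To make the first migration note above concrete, here is a minimal sketch of the last-wins deduplication it describes; it assumes a Spark 3.0 spark-shell session with a `SparkSession` named `spark`:

  // Key 1 is mapped to both 'a' and 'b'. In Spark 2.4 and earlier the
  // result of this lookup was undefined; per the note above, since
  // Spark 3.0 the built-in map functions keep the last value for a
  // duplicated key.
  spark.sql("SELECT map(1, 'a', 1, 'b')[1]").show()   // prints 'b'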

sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

Lines changed: 1 addition & 1 deletion
@@ -1612,7 +1612,7 @@ object SQLConf {
       .createWithDefault(25)
 
   val SET_COMMAND_REJECTS_SPARK_CONFS =
-    buildConf("spark.sql.execution.setCommandRejectsSparkConfs")
+    buildConf("spark.sql.legacy.execution.setCommandRejectsSparkConfs")
       .internal()
       .doc("If it is set to true, SET command will fail when the key is registered as " +
         "a SparkConf entry.")
