
Commit 76c697e

revert
1 parent ea13c5a commit 76c697e

4 files changed (+14, -17 lines)


docs/sql-ref-ansi-compliance.md

Lines changed: 0 additions & 1 deletion
@@ -255,7 +255,6 @@ The behavior of some SQL functions can be different under ANSI mode (`spark.sql.
 The behavior of some SQL operators can be different under ANSI mode (`spark.sql.ansi.enabled=true`).
 - `array_col[index]`: This operator throws `ArrayIndexOutOfBoundsException` if using invalid indices.
 - `map_col[key]`: This operator throws `NoSuchElementException` if key does not exist in map.
-- `GROUP BY`: aliases in a select list can not be used in GROUP BY clauses. Each column referenced in a GROUP BY clause shall unambiguously reference a column of the table resulting from the FROM clause.

 ### Useful Functions for ANSI Mode

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala

Lines changed: 1 addition & 1 deletion
@@ -1948,7 +1948,7 @@ class Analyzer(override val catalogManager: CatalogManager)
     // mayResolveAttrByAggregateExprs requires the TreePattern UNRESOLVED_ATTRIBUTE.
     _.containsAllPatterns(AGGREGATE, UNRESOLVED_ATTRIBUTE), ruleId) {
     case agg @ Aggregate(groups, aggs, child)
-        if allowGroupByAlias && child.resolved && aggs.forall(_.resolved) &&
+        if conf.groupByAliases && child.resolved && aggs.forall(_.resolved) &&
           groups.exists(!_.resolved) =>
       agg.copy(groupingExpressions = mayResolveAttrByAggregateExprs(groups, aggs, child))
   }
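The one-line change swaps a precomputed parameter (`allowGroupByAlias`) for a direct read of the session conf (`conf.groupByAliases`). A minimal, self-contained sketch of that conf-gated-guard pattern — all names here are stand-ins, not Spark's actual types:

```scala
// Stand-ins for SQLConf and the analyzer rule; illustrative only.
case class FakeConf(groupByAliases: Boolean)
case class Agg(groupsResolved: Boolean)

// The guard consults the conf each time the rule matches, so toggling the
// setting between queries changes whether the rewrite fires:
def resolveGroupingAliases(conf: FakeConf, agg: Agg): Agg =
  if (conf.groupByAliases && !agg.groupsResolved) agg.copy(groupsResolved = true)
  else agg
```

Reading the flag through the conf at match time, rather than capturing it when the rule is constructed, keeps per-session settings effective.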

sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

Lines changed: 13 additions & 14 deletions
@@ -240,17 +240,6 @@ object SQLConf {
     .intConf
     .createWithDefault(100)

-  val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
-    .doc("When true, Spark SQL uses an ANSI compliant dialect instead of being Hive compliant. " +
-      "For example, Spark will throw an exception at runtime instead of returning null results " +
-      "when the inputs to a SQL operator/function are invalid." +
-      "For full details of this dialect, you can find them in the section \"ANSI Compliance\" of " +
-      "Spark's documentation. Some ANSI dialect features may be not from the ANSI SQL " +
-      "standard directly, but their behaviors align with ANSI SQL's style")
-    .version("3.0.0")
-    .booleanConf
-    .createWithDefault(false)
-
   val OPTIMIZER_EXCLUDED_RULES = buildConf("spark.sql.optimizer.excludedRules")
     .doc("Configures a list of rules to be disabled in the optimizer, in which the rules are " +
       "specified by their rule names and separated by comma. It is not guaranteed that all the " +

@@ -1221,9 +1210,8 @@ object SQLConf {
     .createWithDefault(true)

   val GROUP_BY_ALIASES = buildConf("spark.sql.groupByAliases")
-    .doc("This configuration is only effective when ANSI mode is disabled. When it is true and " +
-      s"${ANSI_ENABLED.key} is false, aliases in a select list can be used in group by clauses. " +
-      "Otherwise, an analysis exception is thrown in the case.")
+    .doc("When true, aliases in a select list can be used in group by clauses. When false, " +
+      "an analysis exception is thrown in the case.")
     .version("2.2.0")
     .booleanConf
     .createWithDefault(true)

@@ -2547,6 +2535,17 @@ object SQLConf {
     .checkValues(StoreAssignmentPolicy.values.map(_.toString))
     .createWithDefault(StoreAssignmentPolicy.ANSI.toString)

+  val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
+    .doc("When true, Spark SQL uses an ANSI compliant dialect instead of being Hive compliant. " +
+      "For example, Spark will throw an exception at runtime instead of returning null results " +
+      "when the inputs to a SQL operator/function are invalid." +
+      "For full details of this dialect, you can find them in the section \"ANSI Compliance\" of " +
+      "Spark's documentation. Some ANSI dialect features may be not from the ANSI SQL " +
+      "standard directly, but their behaviors align with ANSI SQL's style")
+    .version("3.0.0")
+    .booleanConf
+    .createWithDefault(false)
+
   val SORT_BEFORE_REPARTITION =
     buildConf("spark.sql.execution.sortBeforeRepartition")
       .internal()
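The relocated `ANSI_ENABLED` definition uses Spark's `buildConf` builder chain. A simplified, self-contained sketch of that builder pattern — the `BoolConfBuilder` here is hypothetical, far smaller than Spark's real `ConfigBuilder`:

```scala
// Hypothetical minimal builder mimicking the
// buildConf(...).doc(...).version(...).booleanConf.createWithDefault(...)
// chain seen in the diff; not Spark's actual API.
final class BoolConfBuilder(val key: String) {
  private var docText: String = ""
  private var ver: String = ""
  def doc(d: String): this.type = { docText = d; this }
  def version(v: String): this.type = { ver = v; this }
  def booleanConf: BoolConfBuilder = this
  def createWithDefault(default: Boolean): (String, Boolean) = (key, default)
}
def buildConf(key: String) = new BoolConfBuilder(key)

val ansiEnabled = buildConf("spark.sql.ansi.enabled")
  .doc("Use an ANSI compliant dialect instead of being Hive compliant.")
  .version("3.0.0")
  .booleanConf
  .createWithDefault(false)
```

The fluent chain lets each config entry carry its key, documentation, introduced version, type, and default in one declaration, which is why moving a definition within the file (as this revert does) changes no behavior.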

sql/core/src/test/resources/sql-tests/inputs/ansi/group-analytics.sql

Lines changed: 0 additions & 1 deletion
This file was deleted.
