
Commit f16b6ea

comment address
1 parent d357a9a commit f16b6ea

4 files changed: +12 -11 lines changed

docs/sql-keywords.md

Lines changed: 4 additions & 3 deletions
@@ -19,12 +19,13 @@ license: |
 limitations under the License.
 ---
 
-When `spark.sql.dialect.spark.ansi.enabled` is true, Spark SQL has two kinds of keywords:
+When `spark.sql.dialect=PostgreSQL`, or when the default `spark.sql.dialect=Spark` is kept with `spark.sql.dialect.spark.ansi.enabled` set to true, Spark SQL uses the ANSI mode parser.
+In this mode, Spark SQL has two kinds of keywords:
 * Reserved keywords: Keywords that are reserved and can't be used as identifiers for table, view, column, function, alias, etc.
 * Non-reserved keywords: Keywords that have a special meaning only in particular contexts and can be used as identifiers in other contexts. For example, `SELECT 1 WEEK` is an interval literal, but WEEK can be used as identifiers in other places.
 
-When `spark.sql.dialect.spark.ansi.enabled` is false, Spark SQL has two kinds of keywords:
-* Non-reserved keywords: Same definition as the one when `spark.sql.dialect.spark.ansi.enabled=true`.
+When the ANSI mode is disabled, Spark SQL has two kinds of keywords:
+* Non-reserved keywords: Same definition as the one when the ANSI mode is enabled.
 * Strict-non-reserved keywords: A strict version of non-reserved keywords, which can not be used as table alias.
 
 By default `spark.sql.dialect.spark.ansi.enabled` is false.
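As a quick illustration of the reserved/non-reserved split described above, here is a minimal spark-shell sketch. It assumes a build from this branch and assumes `TABLE` is on the ANSI reserved list (as in the SQL standard); the exact error depends on the keyword chosen.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.parser.ParseException

val spark = SparkSession.builder()
  .master("local[1]")
  .appName("ansi-keyword-sketch")
  .getOrCreate()

// Default: Spark dialect with ANSI mode off, so `table` is only non-reserved
// and can still be used as a column alias.
spark.conf.set("spark.sql.dialect", "Spark")
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "false")
spark.sql("SELECT 1 AS table").show()

// ANSI mode parser: reserved keywords can no longer be used as identifiers,
// so the same statement is expected to be rejected at parse time.
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "true")
try {
  spark.sql("SELECT 1 AS table").show()
} catch {
  case e: ParseException => println(s"Rejected under ANSI mode: ${e.getMessage}")
}
```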

sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4

Lines changed: 3 additions & 5 deletions
@@ -957,8 +957,7 @@ number
     | MINUS? BIGDECIMAL_LITERAL #bigDecimalLiteral
     ;
 
-// When we use PostgreSQL dialect or use Spark dialect with
-// `spark.sql.dialect.spark.ansi.enabled=true`, there are 2 kinds of keywords in Spark SQL.
+// When `use_SQL_standard_keywords=true`, there are 2 kinds of keywords in Spark SQL.
 // - Reserved keywords:
 //     Keywords that are reserved and can't be used as identifiers for table, view, column,
 //     function, alias, etc.
@@ -1158,10 +1157,9 @@ ansiNonReserved
     | YEARS
     ;
 
-// When we use Spark dialect with `spark.sql.dialect.spark.ansi.enabled=false`,
-// there are 2 kinds of keywords in Spark SQL.
+// When `use_SQL_standard_keywords=false`, there are 2 kinds of keywords in Spark SQL.
 // - Non-reserved keywords:
-//     Same definition as the one when the ANSI mode enabled.
+//     Same definition as the one when `use_SQL_standard_keywords=true`.
 // - Strict-non-reserved keywords:
 //     A strict version of non-reserved keywords, which can not be used as table alias.
 // You can find the full keywords list by searching "Start of the keywords list" in this file.
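The grammar comments above refer to a boolean flag, `use_SQL_standard_keywords`, that selects which keyword set the parser accepts. Below is a hypothetical sketch of how such a flag could be wired from the Scala driver into the generated lexer and parser; the member name and the exact wiring are assumptions taken from the comment text, not verified against the full grammar file.

```scala
// Hypothetical wiring sketch. Assumes SqlBase.g4 declares a public boolean member named
// `use_SQL_standard_keywords` in its @members/@lexer::members blocks (only the comment in
// the hunk above is visible here, so the member name is an assumption).
import org.antlr.v4.runtime.{CharStreams, CommonTokenStream}
import org.apache.spark.sql.catalyst.parser.{SqlBaseLexer, SqlBaseParser}

def buildSqlBaseParser(command: String, useSQLStandardKeywords: Boolean): SqlBaseParser = {
  // The real driver also wraps the stream so keywords match case-insensitively;
  // that wrapper is omitted from this sketch.
  val lexer = new SqlBaseLexer(CharStreams.fromString(command))
  lexer.use_SQL_standard_keywords = useSQLStandardKeywords
  val parser = new SqlBaseParser(new CommonTokenStream(lexer))
  parser.use_SQL_standard_keywords = useSQLStandardKeywords
  parser
}
```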

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ParseDriver.scala

Lines changed: 3 additions & 1 deletion
@@ -89,7 +89,9 @@ abstract class AbstractSqlParser(conf: SQLConf) extends ParserInterface with Log
   protected def parse[T](command: String)(toResult: SqlBaseParser => T): T = {
     logDebug(s"Parsing command: $command")
 
-    val useSQLStandardKeywords = Dialect.withName(conf.dialect) match {
+    // When the PostgreSQL dialect is used, or the Spark dialect with
+    // `spark.sql.dialect.spark.ansi.enabled=true`, the parser uses ANSI SQL standard keywords.
+    val useSQLStandardKeywords = conf.dialect match {
       case Dialect.POSTGRESQL => true
       case Dialect.SPARK => conf.dialectSparkAnsiEnabled
     }
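Since `conf.dialect` now returns the enumeration value directly, the match above covers both dialects without any string conversion. Here is a small standalone sketch of the same decision table, using a local stand-in for the `Dialect` enumeration since its definition is not part of this hunk.

```scala
// Local stand-in for the Dialect enumeration referenced in the diff; only the decision
// logic from the hunk above is reproduced here.
object Dialect extends Enumeration {
  val SPARK, POSTGRESQL = Value
}

def useSQLStandardKeywords(dialect: Dialect.Value, dialectSparkAnsiEnabled: Boolean): Boolean =
  dialect match {
    case Dialect.POSTGRESQL => true                    // PostgreSQL dialect always parses with ANSI keywords
    case Dialect.SPARK      => dialectSparkAnsiEnabled // Spark dialect follows spark.sql.dialect.spark.ansi.enabled
  }

// All four combinations:
for (d <- Dialect.values; ansi <- Seq(false, true))
  println(f"dialect=$d%-10s spark.ansi.enabled=$ansi%-5s -> ANSI keywords: ${useSQLStandardKeywords(d, ansi)}")
```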

sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

Lines changed: 2 additions & 2 deletions
@@ -2515,9 +2515,9 @@ class SQLConf extends Serializable with Logging {
 
   def intervalOutputStyle: IntervalStyle.Value = IntervalStyle.withName(getConf(INTERVAL_STYLE))
 
-  def dialect: String = getConf(DIALECT)
+  def dialect: Dialect.Value = Dialect.withName(getConf(DIALECT))
 
-  def usePostgreSQLDialect: Boolean = dialect == Dialect.POSTGRESQL.toString
+  def usePostgreSQLDialect: Boolean = dialect == Dialect.POSTGRESQL
 
   def dialectSparkAnsiEnabled: Boolean = getConf(DIALECT_SPARK_ANSI_ENABLED)
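Typing `dialect` as `Dialect.Value` moves the `withName` conversion into SQLConf, so callers such as `usePostgreSQLDialect` compare enumeration values instead of strings. A small sketch of the `Enumeration.withName` behaviour this relies on, again with a local stand-in for `Dialect`:

```scala
// Stand-in enumeration; mirrors the values used in the hunks above.
object Dialect extends Enumeration {
  val SPARK, POSTGRESQL = Value
}

// The typed accessor converts the string config value once...
def dialect(configValue: String): Dialect.Value = Dialect.withName(configValue)

// ...so callers compare enumeration values rather than strings.
def usePostgreSQLDialect(configValue: String): Boolean = dialect(configValue) == Dialect.POSTGRESQL

assert(usePostgreSQLDialect("POSTGRESQL"))
assert(!usePostgreSQLDialect("SPARK"))

// withName is case-sensitive and throws NoSuchElementException for unknown names, so the
// config entry is presumably validated/normalized before it reaches this accessor
// (an assumption -- that machinery is not part of this diff).
try Dialect.withName("postgresql")
catch { case _: NoSuchElementException => println("unrecognized dialect name rejected") }
```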