Commit c419b93

Code review
1 parent 4debc83 commit c419b93

4 files changed: +11 -11 lines changed


docs/sql-ref-ansi-compliance.md

Lines changed: 2 additions & 2 deletions
@@ -28,7 +28,7 @@ The casting behaviours are defined as store assignment rules in the standard.
When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules. This is a separate configuration because its default value is `ANSI`, while the configuration `spark.sql.ansi.enabled` is disabled by default.

|Property Name|Default|Meaning|Since Version|
-|--- |--- |--- |--- |
+|-------------|-------|-------|-------------|
|`spark.sql.ansi.enabled`|false|(Experimental) When true, Spark tries to conform to the ANSI SQL specification: <br> 1. Spark will throw a runtime exception if an overflow occurs in any operation on integral/decimal field. <br> 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in the SQL parser.|3.0.0|
|`spark.sql.storeAssignmentPolicy`|ANSI|(Experimental) When inserting a value into a column with different data type, Spark will perform type coercion. Currently, we support 3 policies for the type coercion rules: ANSI, legacy and strict. With ANSI policy, Spark performs the type coercion as per ANSI SQL. In practice, the behavior is mostly the same as PostgreSQL. It disallows certain unreasonable type conversions such as converting string to int or double to boolean. With legacy policy, Spark allows the type coercion as long as it is a valid Cast, which is very loose. e.g. converting string to int or double to boolean is allowed. It is also the only behavior in Spark 2.x and it is compatible with Hive. With strict policy, Spark doesn't allow any possible precision loss or data truncation in type coercion, e.g. converting double to int or decimal to double is not allowed.|3.0.0|
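For readers skimming the diff, a brief illustrative session showing what these two settings control (a sketch, not part of the commit; exact error messages vary by Spark version, and the table name is hypothetical):

```sql
-- With ANSI mode off (the default), integral arithmetic overflows silently.
SET spark.sql.ansi.enabled=false;
SELECT 2147483647 + 1;          -- wraps around to -2147483648

-- With ANSI mode on, the same expression throws a runtime exception.
SET spark.sql.ansi.enabled=true;
SELECT 2147483647 + 1;          -- error: integer overflow

-- Under the ANSI store assignment policy, string-to-int coercion on insert
-- is rejected; the legacy policy would accept it as an ordinary Cast.
CREATE TABLE ints (i INT) USING parquet;
INSERT INTO ints VALUES ('1');  -- fails under ANSI policy
```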

@@ -128,7 +128,7 @@ By default `spark.sql.ansi.enabled` is false.
Below is a list of all the keywords in Spark SQL.

|Keyword|Spark SQL<br>ANSI Mode|Spark SQL<br>Default Mode|SQL-2011|
-|--- |--- |--- |--- |
+|-------|----------------------|-------------------------|--------|
|ADD|non-reserved|non-reserved|non-reserved|
|AFTER|non-reserved|non-reserved|non-reserved|
|ALL|reserved|non-reserved|reserved|
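One consequence worth spelling out (an assumed example, not from the commit): a keyword such as ALL that is reserved only in ANSI mode stops being usable as an identifier once that mode is on.

```sql
SET spark.sql.ansi.enabled=false;
SELECT 1 AS all;   -- OK in default mode: ALL is non-reserved
SET spark.sql.ansi.enabled=true;
SELECT 1 AS all;   -- parse error in ANSI mode: ALL is reserved
```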

docs/sql-ref-datatypes.md

Lines changed: 5 additions & 5 deletions
@@ -72,7 +72,7 @@ You can access them by doing
{% include_example data_types scala/org/apache/spark/examples/sql/SparkSQLExample.scala %}

|Data type|Value type in Scala|API to access or create a data type|
-|--- |--- |--- |
+|---------|-------------------|-----------------------------------|
|**ByteType**|Byte|ByteType|
|**ShortType**|Short|ShortType|
|**IntegerType**|Int|IntegerType|
@@ -100,7 +100,7 @@ please use factory methods provided in
`org.apache.spark.sql.types.DataTypes`.

|Data type|Value type in Java|API to access or create a data type|
-|--- |--- |--- |
+|---------|------------------|-----------------------------------|
|**ByteType**|byte or Byte|DataTypes.ByteType|
|**ShortType**|short or Short|DataTypes.ShortType|
|**IntegerType**|int or Integer|DataTypes.IntegerType|
@@ -129,7 +129,7 @@ from pyspark.sql.types import *
{% endhighlight %}

|Data type|Value type in Python|API to access or create a data type|
-|--- |--- |--- |
+|---------|--------------------|-----------------------------------|
|**ByteType**|int or long<br>**Note:** Numbers will be converted to 1-byte signed integer numbers at runtime. Please make sure that numbers are within the range of -128 to 127.|ByteType()|
|**ShortType**|int or long<br>**Note:** Numbers will be converted to 2-byte signed integer numbers at runtime. Please make sure that numbers are within the range of -32768 to 32767.|ShortType()|
|**IntegerType**|int or long|IntegerType()|
@@ -151,7 +151,7 @@ from pyspark.sql.types import *

<div data-lang="r" markdown="1">
|Data type|Value type in R|API to access or create a data type|
-|--- |--- |--- |
+|---------|---------------|-----------------------------------|
|**ByteType**|integer <br>**Note:** Numbers will be converted to 1-byte signed integer numbers at runtime. Please make sure that numbers are within the range of -128 to 127.|"byte"|
|**ShortType**|integer <br>**Note:** Numbers will be converted to 2-byte signed integer numbers at runtime. Please make sure that numbers are within the range of -32768 to 32767.|"short"|
|**IntegerType**|integer|"integer"|
@@ -176,7 +176,7 @@ from pyspark.sql.types import *
The following table shows the type names as well as aliases used in Spark SQL parser for each data type.

|Data type|SQL name|
-|--- |--- |
+|---------|--------|
|**BooleanType**|BOOLEAN|
|**ByteType**|BYTE, TINYINT|
|**ShortType**|SHORT, SMALLINT|
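The SQL-name column means the listed aliases are interchangeable in DDL; a small sketch (hypothetical table name):

```sql
-- TINYINT and BYTE both map to ByteType; SMALLINT and SHORT to ShortType.
CREATE TABLE type_demo (
  flag  BOOLEAN,  -- BooleanType
  tiny  TINYINT,  -- ByteType
  small SMALLINT  -- ShortType
) USING parquet;
```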

docs/sql-ref-datetime-pattern.md

Lines changed: 3 additions & 3 deletions
@@ -29,7 +29,7 @@ There are several common scenarios for datetime usage in Spark:
Spark uses pattern letters in the following table for date and timestamp parsing and formatting:

|Symbol|Meaning|Presentation|Examples|
-|--- |--- |--- |--- |
+|------|-------|------------|--------|
|**G**|era|text|AD; Anno Domini; A|
|**y**|year|year|2020; 20|
|**D**|day-of-year|number|189|
@@ -58,8 +58,8 @@ Spark uses pattern letters in the following table for date and timestamp parsing
|**Z**|zone-offset|offset-Z|+0000; -0800; -08:00;|
|**'**|escape for text|delimiter||
|**''**|single quote|literal|'|
-|**[**|optional section start|||
-|**]**|optional section end|||
+|**[**|optional section start| | |
+|**]**|optional section end| | |

The count of pattern letters determines the format.
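To make the `[` / `]` rows concrete, a short illustration of optional sections alongside a couple of other pattern letters from the table (illustrative only; output depends on session time zone and Spark version):

```sql
-- 'G' renders the era; 'yyyy-MM-dd' the calendar date.
SELECT date_format(timestamp '2020-07-01 12:30:00', 'G yyyy-MM-dd');
-- AD 2020-07-01

-- The optional section [ HH:mm:ss] lets one pattern parse inputs
-- with or without a time-of-day component.
SELECT to_timestamp('2020-07-01', 'yyyy-MM-dd[ HH:mm:ss]');
SELECT to_timestamp('2020-07-01 12:30:00', 'yyyy-MM-dd[ HH:mm:ss]');
```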

docs/sql-ref-syntax-qry-select-tvf.md

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@ function_name ( expression [ , ... ] ) [ table_alias ]
### Supported Table-valued Functions

|Function|Argument Type(s)|Description|
-|--- |--- |--- |
+|--------|----------------|-----------|
|**range** ( end )|Long|Creates a table with a single *LongType* column named *id*, containing<br> rows in a range from 0 to *end* (exclusive) with step value 1.|
|**range** ( start, end )|Long, Long|Creates a table with a single *LongType* column named *id*, containing<br> rows in a range from *start* to *end* (exclusive) with step value 1.|
|**range** ( start, end, step )|Long, Long, Long|Creates a table with a single *LongType* column named *id*, containing<br> rows in a range from *start* to *end* (exclusive) with *step* value.|
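The three arities line up as follows (a quick sketch of the expected rows):

```sql
SELECT * FROM range(3);         -- id: 0, 1, 2
SELECT * FROM range(2, 5);      -- id: 2, 3, 4
SELECT * FROM range(0, 10, 4);  -- id: 0, 4, 8
```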
