Commit 0eeafdc

fix
1 parent a784a9b commit 0eeafdc

File tree

1 file changed: +7 -7 lines changed


sql/core/src/test/resources/sql-tests/results/explain.sql.out

Lines changed: 7 additions & 7 deletions
@@ -55,22 +55,22 @@ struct<plan:string>
 
 == Analyzed Logical Plan ==
 sum(DISTINCT val): bigint
-Aggregate [sum(distinct cast(val#x as bigint)) AS sum(DISTINCT val)#x]
+Aggregate [sum(distinct cast(val#x as bigint)) AS sum(DISTINCT val)#xL]
 +- SubqueryAlias spark_catalog.default.explain_temp1
    +- Relation[key#x,val#x] parquet
 
 == Optimized Logical Plan ==
-Aggregate [sum(distinct cast(val#x as bigint)) AS sum(DISTINCT val)#x]
+Aggregate [sum(distinct cast(val#x as bigint)) AS sum(DISTINCT val)#xL]
 +- Project [val#x]
    +- Relation[key#x,val#x] parquet
 
 == Physical Plan ==
-*(3) HashAggregate(keys=[], functions=[sum(distinct cast(val#x as bigint)#x)], output=[sum(DISTINCT val)#x])
+*(3) HashAggregate(keys=[], functions=[sum(distinct cast(val#x as bigint)#xL)], output=[sum(DISTINCT val)#xL])
 +- Exchange SinglePartition, true, [id=#x]
-   +- *(2) HashAggregate(keys=[], functions=[partial_sum(distinct cast(val#x as bigint)#x)], output=[sum#x])
-      +- *(2) HashAggregate(keys=[cast(val#x as bigint)#x], functions=[], output=[cast(val#x as bigint)#x])
-         +- Exchange hashpartitioning(cast(val#x as bigint)#x, 200), true, [id=#x]
-            +- *(1) HashAggregate(keys=[cast(val#x as bigint) AS cast(val#x as bigint)#x], functions=[], output=[cast(val#x as bigint)#x])
+   +- *(2) HashAggregate(keys=[], functions=[partial_sum(distinct cast(val#x as bigint)#xL)], output=[sum#xL])
+      +- *(2) HashAggregate(keys=[cast(val#x as bigint)#xL], functions=[], output=[cast(val#x as bigint)#xL])
+         +- Exchange hashpartitioning(cast(val#x as bigint)#xL, 200), true, [id=#x]
+            +- *(1) HashAggregate(keys=[cast(val#x as bigint) AS cast(val#x as bigint)#xL], functions=[], output=[cast(val#x as bigint)#xL])
                +- *(1) ColumnarToRow
                   +- FileScan parquet default.explain_temp1[val#x] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[[not included in comparison]/{warehouse_dir}/explain_temp1], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<val:int>
 
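For context: the golden output above is produced by the EXPLAIN test queries under sql/core/src/test/resources/sql-tests/inputs/. A minimal sketch of the kind of statement behind this plan, assuming the input file pairs EXPLAIN EXTENDED with the aggregate shown (the table and column names are taken from the plan itself; the exact query in the input file may differ):

    -- Hypothetical reproduction of the test query.
    -- sum(DISTINCT val) over the int column val is evaluated as bigint,
    -- so the result carries a long-typed expression ID, printed with an
    -- L suffix (#xL rather than #x). That suffix is the only change this
    -- golden file records.
    EXPLAIN EXTENDED
    SELECT sum(DISTINCT val) FROM explain_temp1;

Golden result files like explain.sql.out are normally regenerated rather than edited by hand; in Spark this is done by re-running SQLQueryTestSuite with SPARK_GENERATE_GOLDEN_FILES=1 set in the environment (the exact sbt invocation is an assumption about the build setup at this commit).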
