[SPARK-8945][SQL] Add add and subtract expressions for IntervalType #7398
Conversation
Test build #37240 has finished for PR 7398 at commit
This seems excessive, since it won't work for multiplication or division.
Let's revisit this patch once #7348 is merged.
Test build #37301 has finished for PR 7398 at commit
The failure is about a modified error message. I will update soon.
Test build #37316 has finished for PR 7398 at commit
There are some conflicts now that #7348 is merged. I will update later.
Now that #7348 is in, just create a TypeCollection that contains all the numeric types as well as the interval type.
ok. I will update again.
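The suggestion above, a single collection of acceptable input types covering all numeric types plus interval, can be sketched as a toy model. This is an illustrative simplification and not Catalyst's actual `TypeCollection` API; types are modeled here as plain strings:

```scala
// Simplified model of a TypeCollection: an ordered list of acceptable
// input types for an expression. All names are illustrative assumptions.
object TypeCollectionSketch {
  case class TypeCollection(types: String*) {
    // A collection accepts a type if it appears in the list.
    def acceptsType(t: String): Boolean = types.contains(t)
  }

  // One collection listing all the numeric types plus interval,
  // as suggested for Add / Subtract.
  val addSubtractInputs = TypeCollection(
    "tinyint", "smallint", "int", "bigint",
    "float", "double", "decimal", "interval")

  def main(args: Array[String]): Unit = {
    assert(addSubtractInputs.acceptsType("double"))
    assert(addSubtractInputs.acceptsType("interval"))
    assert(!addSubtractInputs.acceptsType("string"))
    println("all input-type checks passed")
  }
}
```

Because the collection is ordered, it also implies a precedence for implicit casts, which becomes relevant later in this review.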
…ract
Conflicts:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TypeUtils.scala
I don't think you want this -- it is super slow. Just remove this function, and define codegen in Add / Subtract rather than relying on what BinaryArithmetic provides.
OK. I wanted to avoid defining codegen in two operations. I will update them together later.
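The reviewer's point, generating operator-specific Java code instead of relying on one generic slow path, can be sketched with a tiny string-based generator. Everything below (`genAdd`, the `add`/`subtract` method calls, the type names) is an illustrative assumption, not Spark's actual codegen API:

```scala
// Minimal model of per-operator Java code generation, loosely inspired by
// Catalyst-style codegen. All names here are illustrative assumptions.
object CodegenSketch {
  sealed trait DataType
  case object DoubleType extends DataType
  case object IntervalType extends DataType

  // Add emits primitive '+' for numeric types, but a method call for
  // intervals, since generated Java cannot use '+' on interval objects.
  def genAdd(dt: DataType, a: String, b: String): String = dt match {
    case IntervalType => s"$a.add($b)"    // assumed interval add method
    case _            => s"($a + $b)"     // primitive arithmetic
  }

  def genSubtract(dt: DataType, a: String, b: String): String = dt match {
    case IntervalType => s"$a.subtract($b)"
    case _            => s"($a - $b)"
  }

  def main(args: Array[String]): Unit = {
    assert(genAdd(DoubleType, "x", "y") == "(x + y)")
    assert(genAdd(IntervalType, "x", "y") == "x.add(y)")
    println(genSubtract(IntervalType, "i1", "i2"))
  }
}
```

Defining this per operator keeps each operator's generated code a straight-line expression, which is why the shared fallback was considered too slow.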
…ract
Conflicts:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExpressionTypeCheckingSuite.scala
Just inline $decimalMethod here; don't define the function anymore.
LGTM other than that.
LGTM
Test build #37330 has finished for PR 7398 at commit
Test build #37337 has finished for PR 7398 at commit
Test build #37342 has finished for PR 7398 at commit
Looks like an unrelated failure.
Test build #37339 has finished for PR 7398 at commit
Can we simplify it to TypeCollection(NumericType, IntervalType)?
Because TypeCollection also specifies a precedence among types, I am not sure it works if we use NumericType here.
Actually we should use NumericType here.
Think about 2 + "2". Before this PR, we would first cast "2" to 2.0 (double is the default for numeric types), and the result would be the double 4.0. After this PR, we would first cast "2" to 2 (byte is the default for your type collection), and thus make the result of type int.
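The cast-precedence concern can be illustrated directly in plain Scala. The snippet below is a toy analogy, not Catalyst's coercion logic: it only shows how choosing double versus byte as the default target for the string operand changes the result type of the arithmetic:

```scala
// Toy illustration of the coercion difference described above.
// Coercing "2" with a double default keeps 2 + "2" floating point;
// coercing to byte first narrows the arithmetic to integer math.
object CoercionSketch {
  // NumericType-style default: the string literal becomes a double.
  val viaDouble: Double = 2 + "2".toDouble   // 4.0, a double result

  // Explicit-list default starting with byte: the string literal becomes
  // a byte, and int + byte arithmetic promotes only to int.
  val viaByte: Int = 2 + "2".toByte          // 4, an int result

  def main(args: Array[String]): Unit = {
    assert(viaDouble == 4.0)
    assert(viaByte == 4)
    println(s"double path: $viaDouble, byte path: $viaByte")
  }
}
```

The numeric value is the same here, but the result types differ, which is the behavioral change the reviewer is warning about.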
OK. Another, perhaps minor, problem is that the casting error message will become argument 1 is expected to be of type (numeric or interval), instead of ...type (tinyint or smallint or int or bigint or float or double or decimal or interval....
Is that still informative? NumericType is supposed to be internal, so will users know what type "numeric" means?
I saw there are already places that show numeric directly in error messages, so I updated this too.
lgtm
…ract
Conflicts:
sql/catalyst/src/main/scala/org/apache/spark/sql/types/AbstractDataType.scala
Test build #37485 has finished for PR 7398 at commit
Unrelated failure.
Please retest this.
Jenkins, retest this please.
Test build #37608 has finished for PR 7398 at commit
Thanks - I merged it and resolved the conflict.
JIRA: https://issues.apache.org/jira/browse/SPARK-8945
Add add and subtract expressions for IntervalType.
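As a rough sketch of what interval addition and subtraction mean, an interval can be modeled as separate month and microsecond components combined field by field (months are kept separate because they have variable length). The Interval case class below is an illustrative stand-in, not Spark's actual interval class:

```scala
// Illustrative stand-in for an interval value: a months component plus a
// microseconds component, combined componentwise. Not Spark's real class.
case class Interval(months: Int, microseconds: Long) {
  def add(that: Interval): Interval =
    Interval(months + that.months, microseconds + that.microseconds)
  def subtract(that: Interval): Interval =
    Interval(months - that.months, microseconds - that.microseconds)
}

object IntervalSketch {
  def main(args: Array[String]): Unit = {
    val a = Interval(1, 3600000000L)   // 1 month + 1 hour
    val b = Interval(0, 1800000000L)   // 30 minutes
    assert(a.add(b) == Interval(1, 5400000000L))       // 1 month + 90 min
    assert(a.subtract(b) == Interval(1, 1800000000L))  // 1 month + 30 min
    println("interval arithmetic checks passed")
  }
}
```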