[WiP][SPARK-18699] SQL - parsing CSV should return null for certain numeric … #16319
Conversation
…fields - Sargis Dudaklayan and Kuba Tyszko - Zest Finance
Can one of the admins verify this patch?

Can you describe which case this PR fixes (with reproducible code)? I think you meant https://issues.apache.org/jira/browse/SPARK-18699
case _: LongType => datum.toLong
case _: ShortType => Try(datum.toShort).getOrElse(null)
case _: IntegerType => Try(datum.toInt).getOrElse(null)
case _: LongType => Try(datum.toLong).getOrElse(null)
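Standalone, the Try-based conversion in this diff behaves like the following sketch (the helper name parseIntOrNull is hypothetical, used only for illustration):

```scala
import scala.util.Try

// Hypothetical standalone helper mirroring the PR's Try-based approach:
// attempt the numeric conversion, and fall back to null when the token
// (for example an empty string) cannot be parsed.
def parseIntOrNull(datum: String): Any =
  Try(datum.toInt).getOrElse(null)

println(parseIntOrNull("42")) // 42
println(parseIntOrNull(""))   // null
```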
So, I guess you meant that when the actual data is an empty string and nullValue is some other value such as NA, the field should be translated into null when the mode is PERMISSIVE.
Then, this is an exact duplicate of https://issues.apache.org/jira/browse/SPARK-18699. You could resolve the current JIRA as a duplicate and then fix the title of this PR to [SPARK-18699] ... with the changes suggested there.
You're right, fixed the title. Thanks.
Why do you use Try for each type here? In the JIRA, I proposed another approach (master...maropu:SPARK-18699), which I think is a more natural fix. Anyway, it seems to me we have not yet decided whether this issue is worth fixing...
My patch was created before I even knew about SPARK-18699 (I had another ticket open, then learned it was a duplicate).
I agree that your solution may be more generic and intuitive. If others agree, why hasn't it been merged into the main tree yet? It has been over 10 days since your PR.
Behavior changes are quite debatable, so they usually take a long time to discuss.
Let me cc @maropu and @falaki here, who were involved in the JIRA.
@HyukjinKwon Thanks for the ping! I left some comments.
Hi @kubatyszko, are you still working on this? If you are currently unable to proceed further, maybe it should be closed for now; it has been inactive for a few months.
What changes were proposed in this pull request?
This change allows the parsing of a numeric CSV field to fail and return null in that case.
In conjunction with the "nullValue" option, this allows handling of CSV sources that use an empty string to indicate null in one column and another specific value to indicate null in another.
Currently the "nullValue" option can only be provided once, and we can't assume that a data source has only a single "null" indicator.
This problem is very similar to the one discussed here: https://github.com/databricks/spark-csv/issues/239
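The limitation above can be sketched in isolation (the parseField helper and the markers below are hypothetical, not from the PR): a single nullValue such as "NA" covers one column's marker, while letting the numeric conversion fail covers the other.

```scala
import scala.util.Try

// Hypothetical illustration: one column marks nulls with "NA", another with "".
// The single nullValue option can only match one marker; the Try-based numeric
// parsing handles the other by returning null when conversion fails.
val nullValue = "NA"

def parseField(datum: String): Any =
  if (datum == nullValue) null
  else Try(datum.toInt).getOrElse(null)

// A row where the first column uses "NA" and the second uses "" for null:
val row = Seq("NA", "")
println(row.map(parseField)) // List(null, null)
```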
Sargis Dudaklayan and Kuba Tyszko - Zest Finance
How was this patch tested?
The patch was tested using a freshly compiled Spark 2.0.1 on a sample data source with "null" values in two columns: one specified as "NA" and set via nullValue, and another column where "" indicates a missing integer value.