
Conversation

@cloud-fan
Contributor

What changes were proposed in this pull request?

Remove the leading "interval" prefix from CalendarInterval.toString.

Why are the changes needed?

Although the "interval" prefix is allowed when casting a string to an interval, it's not recommended.

This is also consistent with pgsql:

cloud0fan=# select interval '1' day;
 interval 
----------
 1 day
(1 row)

Does this PR introduce any user-facing change?

Yes. When displaying a DataFrame with an interval-type column, the result is different.
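
An illustrative sketch of the difference (exact strings depend on the interval value):

spark-sql> SELECT INTERVAL 1 DAY;
-- before this PR: interval 1 days
-- after this PR:  1 days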

How was this patch tested?

Updated existing tests.

@cloud-fan
Contributor Author

cc @MaxGekk @dongjoon-hyun @maropu

@maropu
Member

maropu commented Nov 5, 2019

Ur, I see, this change looks reasonable. Actually, it seems pgSQL does so:

postgres=# select cast('1 day' as interval);
 interval 
----------
 1 day
(1 row)

@yaooqinn
Member

yaooqinn commented Nov 5, 2019

I have been confused by this weird prefix for a long time, but how about we use the SQL standard output style?

@cloud-fan
Contributor Author

AFAIK pgsql has configs to change the interval output string. We can think about it later.
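
For reference, the relevant PostgreSQL setting is IntervalStyle; a rough sketch of what that looks like (exact output format may vary by version):

postgres=# SET IntervalStyle = 'sql_standard';
SET
postgres=# select interval '1' day;
 interval
----------
 1 0:00:00
(1 row)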

@SparkQA

SparkQA commented Nov 5, 2019

Test build #113271 has finished for PR 26401 at commit 5fb2fe6.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Nov 5, 2019

Test build #113273 has finished for PR 26401 at commit e08c60b.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

private void appendUnit(StringBuilder sb, long value, String unit) {
  if (value != 0) {
-   sb.append(' ').append(value).append(' ').append(unit).append('s');
+   sb.append(value).append(' ').append(unit).append('s').append(' ');

nit: Fewer .append() calls:

Suggested change
- sb.append(value).append(' ').append(unit).append('s').append(' ');
+ sb.append(value).append(' ').append(unit).append("s ");

  }

  sb.setLength(sb.length() - 1);
  return sb.toString();

Not important, but you could leave the code as is and just do trim() on the result.
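
A minimal, self-contained sketch of that alternative (class name, units, and toIntervalString are hypothetical, simplified from the real CalendarInterval code):

public class IntervalFormat {
  // Keep the original " value units" ordering, with a leading space per unit...
  static void appendUnit(StringBuilder sb, long value, String unit) {
    if (value != 0) {
      sb.append(' ').append(value).append(' ').append(unit).append('s');
    }
  }

  // ...and drop the single leading space once at the end with trim().
  static String toIntervalString(long days, long hours) {
    StringBuilder sb = new StringBuilder();
    appendUnit(sb, days, "day");
    appendUnit(sb, hours, "hour");
    return sb.toString().trim();
  }

  public static void main(String[] args) {
    System.out.println(toIntervalString(1, 2)); // prints "1 days 2 hours"
  }
}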

@SparkQA

SparkQA commented Nov 5, 2019

Test build #113278 has finished for PR 26401 at commit dc094a7.

  • This patch passes all tests.
  • This patch does not merge cleanly.
  • This patch adds no public classes.

@dongjoon-hyun
Member

Thank you for pinging me, @cloud-fan. I'm +1 for this, but it seems that we need a migration guide in order to avoid surprises, because the output change might affect old Spark applications that expect that prefix. (cc @gatorsmile)

This PR already shows that kind of side effect.

- df.selectExpr(s"d - $i"),
+ df.selectExpr(s"d - INTERVAL'$i'"),

@cloud-fan
Contributor Author

This PR already shows that kind of side effect.

The particular case you pointed out is not a breaking change: it relies on CalendarInterval.toString, which belongs to a private class.

It does change the result when casting an interval to a string, though. Let me add a migration guide in case users rely on that string.
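
An illustrative sketch of the cast behavior the migration note would cover (exact strings depend on the interval value):

spark-sql> SELECT CAST(INTERVAL 1 DAY AS STRING);
-- before this PR: interval 1 days
-- after this PR:  1 days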

@SparkQA

SparkQA commented Nov 6, 2019

Test build #113297 has finished for PR 26401 at commit d68d974.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor Author

retest this please

@SparkQA

SparkQA commented Nov 6, 2019

Test build #113303 has finished for PR 26401 at commit d68d974.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor Author

retest this please

@SparkQA

SparkQA commented Nov 6, 2019

Test build #113306 has finished for PR 26401 at commit d68d974.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Nov 7, 2019

Test build #113357 has finished for PR 26401 at commit c1f3473.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • case class LocalShuffleReaderExec(child: SparkPlan) extends UnaryExecNode
  • class ContinuousRecordEndpoint(buckets: Seq[Seq[UnsafeRow]], lock: Object)

@cloud-fan
Contributor Author

thanks for the review, merging to master!
