
Conversation

@amanomer
Contributor

What changes were proposed in this pull request?

This PR makes Spark SQL's cast-to-double behavior consistent with PostgreSQL when spark.sql.dialect is configured as PostgreSQL.

Why are the changes needed?

By default, Spark SQL and PostgreSQL differ in many of their cast behaviors between types. Spark SQL's cast behavior should match PostgreSQL's when spark.sql.dialect is configured as PostgreSQL.

Does this PR introduce any user-facing change?

Yes. Users who switch to the PostgreSQL dialect will now:

  • get an AnalysisException when casting a value of BooleanType, StringType, or DateType to double.
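As a hypothetical illustration of the behavior change described above (the exact queries and error messages are not from this PR), casts that Spark SQL accepts by default would be rejected at analysis time under the PostgreSQL dialect:

```sql
-- With the default Spark SQL dialect, this returns 1.0:
SELECT CAST(true AS double);

-- With spark.sql.dialect=PostgreSQL, per the list above, casts to double
-- from BooleanType, StringType, or DateType would raise AnalysisException:
SELECT CAST(true AS double);              -- BooleanType source
SELECT CAST(DATE '2019-01-01' AS double); -- DateType source
```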

How was this patch tested?

Added test cases.

@AmplabJenkins

Can one of the admins verify this patch?

@dongjoon-hyun
Member

Thank you for your contribution, @amanomer. As you know, unfortunately, we decided to remove the PostgreSQL dialect via SPARK-30125 (#26763). Sorry about that. I'll close this PR, too.
