
Conversation

@JoshRosen (Contributor) commented Jun 4, 2022

### What changes were proposed in this pull request?

This is a followup to #36654. That PR modified the existing `QueryPlan.transformDownWithSubqueries` method to add additional arguments for tree pattern pruning.

In this PR, I roll back the change to that method's signature and instead add a new `transformDownWithSubqueriesAndPruning` method.

### Why are the changes needed?

The original change breaks binary and source compatibility in Catalyst. Technically speaking, Catalyst APIs are considered internal to Spark and are subject to change between minor releases (see [source](https://github.com/apache/spark/blob/bb51add5c79558df863d37965603387d40cc4387/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/package.scala#L20-L24)), but I think it's nice to try to avoid API breakage when possible.

While trying to compile some custom Catalyst code, I ran into issues when calling the `transformDownWithSubqueries` method without supplying a tree pattern filter condition: writing `transformDownWithSubqueries() { f }` produces a compilation error. I think this is because the first parameter group consists entirely of default parameters.

My PR's solution of adding a new `transformDownWithSubqueriesAndPruning` method solves this problem. It is also more consistent with the naming convention used for other pruning-enabled tree transformation methods.
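
To illustrate the resulting shape of the API, here is a minimal, self-contained Scala sketch. The `Plan` and `PatternCondition` types, the method bodies, and the single `cond` pruning argument are simplified placeholders for illustration only; they are not the actual Catalyst declarations in `QueryPlan`.

```scala
object TransformSketch {
  // Placeholder stand-ins for Catalyst's plan and tree-pattern types (hypothetical).
  type Plan = String
  type PatternCondition = Plan => Boolean
  val AlwaysProcess: PatternCondition = _ => true

  // The pre-existing single-parameter-list method keeps its old signature,
  // so existing callers continue to compile unchanged.
  def transformDownWithSubqueries(f: PartialFunction[Plan, Plan]): Plan =
    transformDownWithSubqueriesAndPruning(AlwaysProcess)(f)

  // The pruning argument lives on a separate method, matching the naming
  // convention of the other pruning-enabled transform methods.
  def transformDownWithSubqueriesAndPruning(
      cond: PatternCondition)(f: PartialFunction[Plan, Plan]): Plan = {
    val plan: Plan = "plan"
    if (cond(plan) && f.isDefinedAt(plan)) f(plan) else plan
  }

  def main(args: Array[String]): Unit = {
    // No filter condition is needed for the plain method...
    println(transformDownWithSubqueries { case "plan" => "rewritten" })
    // ...while the pruning variant takes its condition explicitly.
    println(transformDownWithSubqueriesAndPruning(_.nonEmpty) { case "plan" => "rewritten" })
  }
}
```

With this split, call sites that do not care about pruning avoid the empty-parameter-group call style entirely, and callers that want pruning opt in through the new method.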

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Existing tests.

github-actions bot added the SQL label Jun 4, 2022
JoshRosen changed the title from "[SPARK-39259][FOLLOWUP] Fix source and binary incompatibilities in transformDownWithSubqueries" to "[SPARK-39259][SQL][FOLLOWUP] Fix source and binary incompatibilities in transformDownWithSubqueries" on Jun 4, 2022
@MaxGekk (Member) commented Jun 4, 2022

+1, LGTM. Merging to master/3.3.
Thank you, @JoshRosen.

MaxGekk closed this in eda6c4b on Jun 4, 2022
MaxGekk pushed a commit that referenced this pull request on Jun 4, 2022:

[SPARK-39259][SQL][FOLLOWUP] Fix source and binary incompatibilities in transformDownWithSubqueries


Closes #36765 from JoshRosen/SPARK-39259-binary-compatibility-followup.

Authored-by: Josh Rosen <[email protected]>
Signed-off-by: Max Gekk <[email protected]>
(cherry picked from commit eda6c4b)
Signed-off-by: Max Gekk <[email protected]>
Review comment on the new `transformDownWithSubqueriesAndPruning` method in `QueryPlan`:

```scala
 * first to this node, then this node's subqueries and finally this node's children.
 * When the partial function does not apply to a given node, it is left unchanged.
 */
def transformDownWithSubqueriesAndPruning(
```
A reviewer (Member) commented:

I was about to say we shouldn't make these changes for binary compatibility of an internal API (e.g., #35378), but reading the code, this looks more like a refactoring. So LGTM from me too.
