Conversation

@WangTaoTheTonic
Contributor

It seems that the vals "startTime" and "endTime" are never used, so delete them.
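
A minimal Scala sketch of the kind of cleanup being proposed; the surrounding class and fields here are hypothetical, since the thread does not show the actual file:

```scala
// Hypothetical illustration only; the real class in the Spark codebase is not shown in this thread.
class JobInfo(conf: Map[String, String]) {
  // Before: vals like these were assigned but never read anywhere in the class.
  // val startTime = System.currentTimeMillis()
  // val endTime: Long = -1L

  // After: the unused vals are simply removed; no other behavior changes.
  def describe(): String = s"JobInfo with ${conf.size} settings"
}
```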

@AmplabJenkins

Can one of the admins verify this patch?

@rxin
Contributor

rxin commented Apr 25, 2014

Jenkins, test this please.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14486/

@rxin
Contributor

rxin commented Apr 25, 2014

Thanks. I've merged this.

asfgit pushed a commit that referenced this pull request Apr 25, 2014
It seems that the vals "startTime" and "endTime" are never used, so delete them.

Author: WangTao <[email protected]>

Closes #553 from WangTaoTheTonic/master and squashes the following commits:

4fcb639 [WangTao] Delete the val that never used

(cherry picked from commit 25a276d)
Signed-off-by: Reynold Xin <[email protected]>
asfgit closed this in 25a276d Apr 25, 2014
pwendell pushed a commit to pwendell/spark that referenced this pull request Apr 27, 2014
It looks like this just requires taking out the checks.

I verified that, with the patch, I was able to run spark-shell through yarn without setting the environment variable.

Author: Sandy Ryza <[email protected]>

Closes apache#553 from sryza/sandy-spark-1053 and squashes the following commits:

b037676 [Sandy Ryza] SPARK-1053.  Don't require SPARK_YARN_APP_JAR
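
A hedged Scala sketch of what "taking out the checks" might look like; the object name, method, and the exact form of the removed check are assumptions for illustration, not the real diff:

```scala
// Hypothetical sketch of dropping a hard requirement on an environment variable.
// The real Spark YARN client code is not reproduced here.
object YarnArgs {
  def appJar(args: Map[String, String]): Option[String] = {
    // Before (roughly): a missing SPARK_YARN_APP_JAR caused a hard failure.
    // require(sys.env.contains("SPARK_YARN_APP_JAR"), "SPARK_YARN_APP_JAR must be set")

    // After: fall back gracefully, so spark-shell can run on YARN without it.
    args.get("jar").orElse(sys.env.get("SPARK_YARN_APP_JAR"))
  }
}
```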
pdeyhim pushed a commit to pdeyhim/spark-1 that referenced this pull request Jun 25, 2014
It seems that the vals "startTime" and "endTime" are never used, so delete them.

Author: WangTao <[email protected]>

Closes apache#553 from WangTaoTheTonic/master and squashes the following commits:

4fcb639 [WangTao] Delete the val that never used
gzm55 pushed a commit to MediaV/spark that referenced this pull request Jul 17, 2014
It looks like this just requires taking out the checks.

I verified that, with the patch, I was able to run spark-shell through yarn without setting the environment variable.

Author: Sandy Ryza <[email protected]>

Closes apache#553 from sryza/sandy-spark-1053 and squashes the following commits:

b037676 [Sandy Ryza] SPARK-1053.  Don't require SPARK_YARN_APP_JAR
erikerlandson pushed a commit to erikerlandson/spark that referenced this pull request Nov 27, 2017
bzhaoopenstack added a commit to bzhaoopenstack/spark that referenced this pull request Sep 11, 2019
…e dev" (apache#553)

Change "make bin" to "make dev"
Now we just build the binary for the local host environment.

Close: theopenlab/openlab#247
turboFei pushed a commit to turboFei/spark that referenced this pull request Nov 6, 2025
… expression (apache#553)

### What changes were proposed in this pull request?

This patch avoids `ArrayTransform` in the `resolveArrayType` function if the resolution expression is the same as the input parameter.

### Why are the changes needed?

Our customer encountered a significant performance regression when migrating from Spark 3.2 to Spark 3.4 on an `Insert Into` query that is analyzed as an `AppendData` on an Iceberg table.
We found that the root cause is that in Spark 3.4, `TableOutputResolver` resolves the query with an additional `ArrayTransform` on an `ArrayType` field. The `ArrayTransform`'s lambda function is actually an identity function, i.e., the transformation is redundant.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Unit test and manual e2e test

### Was this patch authored or co-authored using generative AI tooling?

No

Closes apache#47863 from viirya/fix_redundant_array_transform_3.5.

Authored-by: Liang-Chi Hsieh <[email protected]>

Signed-off-by: Dongjoon Hyun <[email protected]>
Co-authored-by: Liang-Chi Hsieh <[email protected]>
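
A minimal Scala sketch of the idea described in the commit message above, using a simplified, hypothetical expression model rather than Spark's actual Catalyst classes: skip the `ArrayTransform` wrapper whenever resolving the element yields the lambda variable unchanged (an identity function).

```scala
// Hypothetical, simplified model of the optimization; not Spark's real Catalyst API.
object ResolveArraySketch {
  sealed trait Expr
  case class LambdaVar(name: String) extends Expr
  case class Cast(child: Expr, to: String) extends Expr
  case class ArrayTransform(input: Expr, param: LambdaVar, body: Expr) extends Expr
  case class InputArray(name: String) extends Expr

  def resolveArrayType(input: Expr, resolveElement: LambdaVar => Expr): Expr = {
    val param = LambdaVar("element")
    val resolved = resolveElement(param)
    if (resolved == param) {
      // The lambda would be the identity function, so wrapping the input in an
      // ArrayTransform is redundant; return the input column unchanged.
      input
    } else {
      ArrayTransform(input, param, resolved)
    }
  }

  def main(args: Array[String]): Unit = {
    val col = InputArray("values")
    // Identity resolution: no redundant ArrayTransform is produced.
    assert(resolveArrayType(col, identity) == col)
    // A genuine element-level change still gets wrapped.
    assert(resolveArrayType(col, p => Cast(p, "long")) ==
      ArrayTransform(col, LambdaVar("element"), Cast(LambdaVar("element"), "long")))
    println("sketch ok")
  }
}
```

In the real resolver the comparison is against Catalyst expressions, but the shape of the check is the same: only build the per-element transform when the resolution actually changes something.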