
Conversation


@dongjoon-hyun dongjoon-hyun commented Oct 11, 2025

What changes were proposed in this pull request?

This PR aims to fix SparkAppDriverConf to respect sparkVersion of SparkApplication CRD.

Why are the changes needed?

This is a long-standing bug from the initial implementation.

Since the Apache Spark K8s Operator can launch various Spark versions, the spark-version label should come from the SparkApplication CRD's sparkVersion field.

However, the Spark version of the compile dependency is currently used for Driver resources (like the Driver Pod and Driver Service). We should override this.
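Conceptually, the fix overrides the inherited labels map so that the CRD-supplied sparkVersion wins over the version baked in at compile time. A minimal, self-contained Java sketch of that pattern (class and label names here are illustrative, not the operator's actual classes):

```java
import java.util.HashMap;
import java.util.Map;

class BaseConf {
    // Simulates base labels carrying the compile-time Spark version.
    Map<String, String> labels() {
        Map<String, String> labels = new HashMap<>();
        labels.put("spark-version", "4.1.0-preview2");
        return labels;
    }
}

class AppDriverConf extends BaseConf {
    private final String sparkVersion;

    AppDriverConf(String sparkVersion) {
        // sparkVersion would come from the SparkApplication CRD.
        this.sparkVersion = sparkVersion;
    }

    @Override
    Map<String, String> labels() {
        // Copy the inherited labels, then override the version entry
        // with the CRD-supplied value.
        Map<String, String> labels = new HashMap<>(super.labels());
        labels.put("spark-version", sparkVersion);
        return labels;
    }
}

public class Demo {
    public static void main(String[] args) {
        // The overridden label reflects the CRD version, not the compile dependency.
        System.out.println(new AppDriverConf("3.5.1").labels().get("spark-version"));
    }
}
```

The actual fix uses the same shape, but on a Scala immutable map (see the `labels()` override in the diff below), where `updated` returns a new map with the replaced entry rather than mutating in place.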

Does this PR introduce any user-facing change?

Yes, this is a bug fix to use the correct version information.

How was this patch tested?

Pass the CIs.

Was this patch authored or co-authored using generative AI tooling?

No.

@Override
public scala.collection.immutable.Map<String, String> labels() {
  return super.labels().updated(LABEL_SPARK_VERSION_NAME, sparkVersion);
}
@dongjoon-hyun (Member Author)

This overrides the underlying Spark version (from the compile dependency), 4.1.0-preview2.

@dongjoon-hyun (Member Author)

cc @jiangzho , @peter-toth , @viirya .

@dongjoon-hyun (Member Author)

Thank you, @viirya ! Merged to main.

@dongjoon-hyun dongjoon-hyun deleted the SPARK-53874 branch October 11, 2025 07:38
