Changes from all commits (87 commits)
ebf7612  Hopsfy spark-3.5 (gibchikafa, Feb 18, 2025)
9bdf676  Drop Index support, not compatible with the hive version (gibchikafa, Feb 21, 2025)
d871e4c  DaysWritableV2 (gibchikafa, Feb 21, 2025)
2135dcf  Exclude com.vlkan:flatbuffers:1.2.0 dependency from build (gibchikafa, Feb 21, 2025)
3056fcc  Apply uniffle changes (gibchikafa, Feb 24, 2025)
9374587  Reduce number of event logs flushes (#26) (SirOibaf, Oct 5, 2021)
eeb2aaa  [HWORKS-1405] Get correct hive version in spark (#40) (gibchikafa, Jul 4, 2024)
0c3b65b  Timestamp incompatibility Spark/Hive/Hudi - Hive fix - release 3.1.1.3 (gibchikafa, Feb 21, 2025)
0bea3ba  Bump hive version to 3.0.0.13.5 (#39) (gibchikafa, Jun 26, 2024)
f7d50b2  Change hadoop version (gibchikafa, Feb 20, 2025)
2a5fee4  Add Hops Repo (gibchikafa, Jun 11, 2025)
4688cfd  [SPARK-52420][PYTHON][TESTS][FOLLOW-UP][3.5] Make test_udtf_with_inva… (HyukjinKwon, Jun 14, 2025)
5f98a22  Change version to 3.5.8 (gibchikafa, Jun 16, 2025)
e03729f  Change kafka version to 2.6.0 (gibchikafa, Jun 16, 2025)
8efa025  Change version to 3.5.5 (gibchikafa, Jun 19, 2025)
02f7fb3  [SPARK-52339][SQL][3.5] Fix comparison of `InMemoryFileIndex` instances (bersprockets, Jun 24, 2025)
a136421  [SPARK-52562][INFRA] Automatically create the base of release notes a… (HyukjinKwon, Jun 24, 2025)
80a2098  [SPARK-52339][SQL][FOLLOWUP] Sort paths in InMemoryFileIndex#equal on… (yaooqinn, Jun 25, 2025)
185380c  [SPARK-52584][BUILD] Make build script to support preview releases in… (HyukjinKwon, Jun 26, 2025)
53a22c0  [SPARK-52568][BUILD][3.5] Fix `exec-maven-plugin` version used by `de… (pan3793, Jun 27, 2025)
87f58c0  Add org.codehaus.jackson (gibchikafa, Jun 29, 2025)
7d8a596  Reset kafka version to 3.4.1 (gibchikafa, Jun 29, 2025)
57da485  [SPARK-52611][SQL] Fix SQLConf version for excludeSubqueryRefsFromRem… (Jun 30, 2025)
a53a9c4  [SPARK-52381][CORE][3.5] JsonProtocol: Only accept subclasses of Spar… (pjfanning, Jun 30, 2025)
3f2a3ba  [SPARK-52023][SQL] Fix data corruption/segfault returning Option[Prod… (eejbyfeldt, Jul 1, 2025)
a56879e  Revert "[SPARK-52023][SQL] Fix data corruption/segfault returning Opt… (dongjoon-hyun, Jul 1, 2025)
1c408c3  [SPARK-52023][SQL][3.5] Fix data corruption/segfault returning Option… (eejbyfeldt, Jul 2, 2025)
20c9add  [SPARK-52635][BUILD][3.5] Upgrade ORC to 1.9.7 (dongjoon-hyun, Jul 4, 2025)
029503a  [SPARK-52684][SQL] Make CACHE TABLE Commands atomic while encounterin… (yaooqinn, Jul 7, 2025)
1832d01  [SPARK-52707][BUILD] Remove preview postfix when looking up the JIRA … (HyukjinKwon, Jul 8, 2025)
fb9cd10  [SPARK-52721][PYTHON] Fix message parameter for CANNOT_PARSE_DATATYPE (yaooqinn, Jul 9, 2025)
6103272  [SPARK-52749][BUILD] Replace preview1 to dev1 in its PyPI package nam… (HyukjinKwon, Jul 10, 2025)
157c7ec  [SPARK-52721][PYTHON][HOTFIX] Fix message parameter for CANNOT_PARSE_… (yaooqinn, Jul 10, 2025)
218d292  [SPARK-52809][SQL] Don't hold reader and iterator references for all … (viirya, Jul 16, 2025)
eb123a1  [SPARK-46941][SQL][3.5] Can't insert window group limit node for top-… (zml1206, Jul 16, 2025)
8d85c5a  [SPARK-52776][CORE][3.5] Do not split the comm field in ProcfsMetrics… (Jul 16, 2025)
eef9576  Revert "Preparing development version 3.5.8-SNAPSHOT" (HyukjinKwon, Jul 17, 2025)
baa514f  Revert "Preparing Spark release v3.5.7-rc1" (HyukjinKwon, Jul 17, 2025)
ebe6ca8  [SPARK-52516][SQL] Don't hold previous iterator reference after advan… (viirya, Jul 18, 2025)
98645a2  [SPARK-52791][PS] Fix error when inferring a UDT with a null first el… (petern48, Jul 23, 2025)
80c1f5f  [SPARK-52737][CORE] Pushdown predicate and number of apps to FsHistor… (shardulm94, Jul 26, 2025)
5ccd68b  [SPARK-52944][CORE][SQL][YARN][TESTS][3.5] Fix invalid assertions in … (LuciferYang, Jul 28, 2025)
0fa4507  [SPARK-52945][SQL][TESTS] Split `CastSuiteBase#checkInvalidCastFromNu… (LuciferYang, Jul 28, 2025)
a137e4b  Try to create a workflow to build 3.5 (vatj, Aug 1, 2025)
1632daa  Add settings to mvn command (vatj, Aug 1, 2025)
22355ba  [SPARK-53054][CONNECT][3.5] Fix the connect.DataFrameReader default f… (dillitz, Aug 1, 2025)
ae70bf9  Fix bootstrapper (vatj, Aug 4, 2025)
db1b30b  Again (vatj, Aug 4, 2025)
2524c0a  Revert "[SPARK-49182][DOCS][PYTHON] Stop publish site/docs/{version}/… (HyukjinKwon, Aug 5, 2025)
665ccb3  [SPARK-53094][SQL][3.5] Fix CUBE with aggregate containing HAVING cla… (peter-toth, Aug 6, 2025)
71ab2cc  [SPARK-53155][SQL] Global lower agggregation should not be replaced w… (viirya, Aug 7, 2025)
4e9dbc8  Bump hopsfs (vatj, Aug 8, 2025)
3f87386  Use JIRA tag version for hopsfs (vatj, Aug 12, 2025)
33a2aa8  [SPARK-48746][PYTHON][SS][TESTS] Avoid using global temp view in fore… (HyukjinKwon, Jun 28, 2024)
2eeeac8  [MINOR][PYTHON][TESTS] Use different temp table name in foreachBatch … (HyukjinKwon, Aug 13, 2025)
a20cd2e  snapshots distrib management (vatj, Aug 14, 2025)
f9a5c8c  [SPARK-49872][CORE] Remove jackson JSON string length limitation (cloud-fan, Aug 19, 2025)
f7e85e0  [SPARK-52873][SQL][3.5] Further restrict when SHJ semi/anti join can … (bersprockets, Aug 27, 2025)
5afcc0a  Test with arrow upgrade (vatj, Aug 27, 2025)
8a51801  Use hopsfs 3.2.0.17-EE-RC1 (vatj, Aug 27, 2025)
11e3f3b  Use arrow jar compatible with java8 (vatj, Sep 1, 2025)
494fd9c  Add build workflow (vatj, Sep 2, 2025)
2cb458a  Add workflow dispatch (vatj, Sep 2, 2025)
3a449e8  add build arg (vatj, Sep 2, 2025)
c6a95be  [SPARK-53435][SQL][3.5] Fix race condition in CachedRDDBuilder (liuzqt, Sep 2, 2025)
55f75df  [MINOR][BUILD] Fix download of preview releases in the news (HyukjinKwon, Sep 3, 2025)
4508f9c  [MINOR][BUILD] Remove todos for testing in the real releases (HyukjinKwon, Sep 3, 2025)
d39d1e0  [MINOR][BUILD] Remove `preview` postfix in documentation.md when rele… (HyukjinKwon, Sep 3, 2025)
e93e70c  Fix typos (vatj, Sep 3, 2025)
cf7f364  Add workflow call outputs (vatj, Sep 3, 2025)
2807c0e  Test (vatj, Sep 3, 2025)
1356f00  Build spark with arrow12 (vatj, Sep 3, 2025)
1e2bd68  Use 31 (vatj, Sep 3, 2025)
cf22165  Test explicit (vatj, Sep 3, 2025)
c8f2665  undo flatbuffers (vatj, Sep 3, 2025)
6c1c512  [SPARK-53472][DOCS] Fix jekyll-redirect-from template and generated h… (yaooqinn, Sep 4, 2025)
20afdc9  Merge branch 'branch-3.5' into HWORKS-2203-vatj (vatj, Sep 4, 2025)
0aa117d  Remove snapshot (vatj, Sep 4, 2025)
4409035  Save all (vatj, Sep 4, 2025)
3a26972  implicit path error (vatj, Sep 5, 2025)
107151d  Move implicit path ordering (vatj, Sep 5, 2025)
a18909c  Exclude google flatbuffers (vatj, Sep 5, 2025)
7142ee9  Remove InMemoryFileIndex fix (vatj, Sep 5, 2025)
97a7daa  Test with new hive (vatj, Sep 5, 2025)
760012f  Add HiveEE to known repositories (vatj, Sep 5, 2025)
b9b2c5c  Edit workflow (vatj, Sep 5, 2025)
1aadfba  Update spark version to 3.5.5 (vatj, Sep 8, 2025)
48 changes: 48 additions & 0 deletions .github/workflows/build_branch35.yml
@@ -0,0 +1,48 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#

name: "Build (branch-3.5, Scala 2.13, Hadoop 3, JDK 8)"

on:
  pull_request:

jobs:
  run-build:
    permissions:
      packages: write
    name: Run
    uses: ./.github/workflows/build_and_test.yml
    if: github.repository == 'logicalclocks/spark'
    with:
      java: 8
      branch: branch-3.5
      hadoop: hadoop3
      envs: >-
        {
          "SCALA_PROFILE": "scala2.13"
        }
      jobs: >-
        {
          "build": "true",
          "pyspark": "true",
          "sparkr": "true",
          "tpcds-1g": "true",
          "docker-integration-tests": "true",
          "lint" : "true"
        }
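
Note: the job above only selects profiles and hands off to the reusable build_and_test.yml workflow. For a rough local approximation of the same combination (Scala 2.13, Hadoop 3, JDK 8), a minimal sketch using the standard Spark build scripts follows; the exact flags build_and_test.yml derives from SCALA_PROFILE/hadoop3 may differ.

# Hypothetical local approximation of the CI job above; not the CI invocation itself.
./dev/change-scala-version.sh 2.13
./build/mvn -Pscala-2.13 -Phadoop-3 -DskipTests clean package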
1 change: 1 addition & 0 deletions .github/workflows/build_main.yml
@@ -23,6 +23,7 @@ on:
  push:
    branches:
      - '**'
  pull_request:

jobs:
  call-build-and-test:
204 changes: 204 additions & 0 deletions .github/workflows/build_spark_with_hopsfs.yaml
@@ -0,0 +1,204 @@
name: Build Spark with hopsfs

on:
  workflow_call:
    inputs:
      ref:
        description: 'The ref to checkout for the spark repo, default is branch-3.5'
        required: false
        type: string
        default: 'branch-3.5'
      jira_tag:
        description: 'The tag to use for the jira release, default is the version from version.txt'
        required: false
        type: string
        default: 'NOJIRA'
      runner:
        description: 'The type of runner to use, default is ghrunner-ee8'
        required: false
        type: string
        default: 'ghrunner-ee8'
      build:
        description: 'Whether to build spark or not, default is false. If this is false then the workflow will only prepare the versioning related outputs.'
        required: false
        type: boolean
        default: true
    secrets:
      NEXUS_HARBOR_PASSWORD:
        required: true
    outputs:
      pom_version_no_jira:
        value: ${{ jobs.build-spark.outputs.pom_version_no_jira }}
        description: 'The pom version without the jira tag'
      pom_version:
        value: ${{ jobs.build-spark.outputs.pom_version }}
        description: 'The pom version with the jira tag'
      commit_hash:
        value: ${{ jobs.build-spark.outputs.commit_hash }}
        description: 'The commit hash of the spark repo'
      jira_tag:
        value: ${{ jobs.build-spark.outputs.jira_tag }}
        description: 'The jira tag used for the build'
      spark_tar_name:
        value: ${{ jobs.build-spark.outputs.spark_tar_name }}
        description: 'The name of the spark tar file'
      spark_tar_url:
        value: ${{ jobs.build-spark.outputs.spark_tar_url }}
        description: 'The url of the spark tar file'
      hopsfs_version:
        value: ${{ jobs.build-spark.outputs.hopsfs_version }}
        description: 'The version of hopsfs used in the build'
  workflow_dispatch:
    inputs:
      ref:
        description: 'The ref to checkout for the spark repo, default is branch-3.5'
        required: false
        type: string
        default: 'branch-3.5'
      jira_tag:
        description: 'The tag to use for the jira release, default is the version from version.txt'
        required: false
        type: string
        default: 'NOJIRA'
      runner:
        description: 'The type of runner to use, default is ghrunner-ee8'
        required: false
        type: string
        default: 'ghrunner-ee8'
      build:
        description: 'Whether to build spark or not, default is false. If this is false then the workflow will only prepare the versioning related outputs.'
        required: false
        type: boolean
        default: true
  pull_request:

concurrency:
  group: build-spark-${{ github.workflow }}-${{ github.job }}-${{ inputs.jira_tag || 'NOJIRA' }}
  cancel-in-progress: true

# Used to avoid error on PRs
env:
  # SPARK_REF: ${{ inputs.ref || 'branch-3.5' }}
  SPARK_REF: ${{ inputs.ref || 'HWORKS-2203-vatj' }}
  JIRA_TAG: ${{ inputs.jira_tag || 'NOJIRA' }}

jobs:
  build-spark:
    runs-on: ${{ inputs.runner }}
    outputs:
      pom_version_no_jira: ${{ steps.prep_version.outputs.pom_version_no_jira }}
      pom_version: ${{ steps.prep_version.outputs.pom_version }}
      commit_hash: ${{ steps.prep_version.outputs.commit_hash }}
      jira_tag: ${{ env.JIRA_TAG }}
      spark_tar_name: ${{ steps.prep_version.outputs.spark_tar_name }}
      spark_tar_url: ${{ steps.prep_version.outputs.spark_tar_url }}
      hopsfs_version: ${{ steps.prep_version.outputs.hopsfs_version }}
    steps:
      - name: Checkout spark repo
        uses: actions/checkout@v4
        with:
          repository: logicalclocks/spark
          ref: ${{ env.SPARK_REF }}
          path: ${{ github.workspace }}/spark

      - name: To build or not to build
        id: to_build_or_not_to_build
        shell: bash
        env:
          BUILD_SPARK: ${{ (github.event_name == 'pull_request' && contains(join(github.event.pull_request.labels.*.name, ','), 'build-spark')) || inputs.build }}
        run: |
          if [[ "${{ env.BUILD_SPARK }}" != "true" ]]; then
            echo "# :recycle: Not building Spark" >> $GITHUB_STEP_SUMMARY
            if [[ "${{ github.event_name }}" == "pull_request" ]]; then
              echo "This is a pull request and the 'build-spark' label is not present." >> $GITHUB_STEP_SUMMARY
              echo "pull_request_labels=${{ join(github.event.pull_request.labels.*.name, ', ') }}" >> $GITHUB_STEP_SUMMARY
            elif [[ "${{ inputs.build || 'false'}}" != "true" ]]; then
              echo "The input 'build' is set to false." >> $GITHUB_STEP_SUMMARY
            fi
            echo "BUILD_SPARK=$BUILD_SPARK" >> $GITHUB_OUTPUT
          else
            echo "# :white_check_mark: Building Spark" >> $GITHUB_STEP_SUMMARY
            echo "BUILD_SPARK=$BUILD_SPARK" >> $GITHUB_OUTPUT
          fi

      - name: Prep step version
        shell: bash
        id: prep_version
        working-directory: ${{ github.workspace }}/spark
        run: |
          COMMIT_HASH=$(git rev-parse --short HEAD)
          POM_VERSION_NO_JIRA=$(mvn -q -Dexec.executable="echo" -Dexec.args='${project.version}' --non-recursive exec:exec)
          find . -name "pom.xml" -exec sed -i "s|<version>${POM_VERSION_NO_JIRA}</version>|<version>${POM_VERSION_NO_JIRA%-SNAPSHOT}-${JIRA_TAG}-SNAPSHOT</version>|g" {} \;
          POM_VERSION=$(mvn -q -Dexec.executable="echo" -Dexec.args='${project.version}' --non-recursive exec:exec)
          SPARK_TAR_NAME=spark-${POM_VERSION}-bin-without-hadoop-with-hive.tgz
          SPARK_TAR_URL="${{ vars.NEXUS_DEV_SPARK_URL }}/${JIRA_TAG}/${SPARK_TAR_NAME}"
          HOPSFS_VERSION=$(mvn -q -Dexec.executable="echo" -Dexec.args='${hadoop.version}' --non-recursive exec:exec)

          echo "POM_VERSION_NO_JIRA=${POM_VERSION_NO_JIRA}" >> $GITHUB_ENV
          echo "POM_VERSION=${POM_VERSION}" >> $GITHUB_ENV
          echo "COMMIT_HASH=$COMMIT_HASH" >> $GITHUB_ENV
          echo "SPARK_TAR_NAME=${SPARK_TAR_NAME}" >> $GITHUB_ENV
          echo "SPARK_TAR_URL=${SPARK_TAR_URL}" >> $GITHUB_ENV
          echo "HOPSFS_VERSION=${HOPSFS_VERSION}" >> $GITHUB_ENV

          echo "POM_VERSION_NO_JIRA=${POM_VERSION_NO_JIRA}" >> $GITHUB_STEP_SUMMARY
          echo "POM_VERSION=${POM_VERSION}" >> $GITHUB_STEP_SUMMARY
          echo "COMMIT_HASH=$COMMIT_HASH" >> $GITHUB_STEP_SUMMARY
          echo "SPARK_TAR_NAME=${SPARK_TAR_NAME}" >> $GITHUB_STEP_SUMMARY
          echo "SPARK_TAR_URL=${SPARK_TAR_URL}" >> $GITHUB_STEP_SUMMARY
          echo "HOPSFS_VERSION=${HOPSFS_VERSION}" >> $GITHUB_STEP_SUMMARY

          echo "POM_VERSION=${POM_VERSION}" >> $GITHUB_OUTPUT
          echo "POM_VERSION_NO_JIRA=${POM_VERSION_NO_JIRA}" >> $GITHUB_OUTPUT
          echo "COMMIT_HASH=$COMMIT_HASH" >> $GITHUB_OUTPUT
          echo "SPARK_TAR_NAME=${SPARK_TAR_NAME}" >> $GITHUB_OUTPUT
          echo "SPARK_TAR_URL=${SPARK_TAR_URL}" >> $GITHUB_OUTPUT
          echo "HOPSFS_VERSION=${HOPSFS_VERSION}" >> $GITHUB_OUTPUT

      - name: Set up .m2 settings.xml
        shell: bash
        if: steps.to_build_or_not_to_build.outputs.BUILD_SPARK == 'true'
        env:
          M2_HOME: ~/.m2
        run: |
          echo "M2_HOME var is $M2_HOME" >> $GITHUB_STEP_SUMMARY
          mkdir -p ~/.m2
          echo "<settings><servers>" > ~/.m2/settings.xml
          echo "<server><id>HopsEE</id><username>${{ vars.NEXUS_HARBOR_USER }}</username><password>${{ secrets.NEXUS_HARBOR_PASSWORD }}</password><configuration></configuration></server>" >> ~/.m2/settings.xml
          echo "<server><id>HiveEE</id><username>${{ vars.NEXUS_HARBOR_USER }}</username><password>${{ secrets.NEXUS_HARBOR_PASSWORD }}</password><configuration></configuration></server>" >> ~/.m2/settings.xml
          echo "</servers></settings>" >> ~/.m2/settings.xml

      - name: Cache maven
        id: cache-maven
        if: steps.to_build_or_not_to_build.outputs.BUILD_SPARK == 'true'
        uses: actions/cache@v4
        with:
          path: |
            ~/.m2
            !~/.m2/settings.xml
          key: ${{ runner.os }}-maven-spark-${{ hashFiles('spark/**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-spark-

      - name: Build spark and spark-packaging
        shell: bash
        if: steps.to_build_or_not_to_build.outputs.BUILD_SPARK == 'true'
        working-directory: ${{ github.workspace }}/spark
        env:
          POM_VERSION: ${{ env.POM_VERSION }}
          M2_HOME: ~/.m2
        run: |
          ./dev/make-distribution.sh --name without-hadoop-with-hive --tgz "-Pkubernetes,hadoop-provided,parquet-provided,hive,hadoop-cloud,bigtop-dist"

      - name: Upload spark-packaging artifact to Nexus
        shell: bash
        if: steps.to_build_or_not_to_build.outputs.BUILD_SPARK == 'true'
        working-directory: ${{ github.workspace }}/spark
        env:
          M2_HOME: ~/.m2
        run: |
          curl -u ${{ vars.NEXUS_HARBOR_USER }}:${{ secrets.NEXUS_HARBOR_PASSWORD }} --upload-file spark-$POM_VERSION-bin-without-hadoop-with-hive.tgz "${SPARK_TAR_URL}"
          export MAVEN_OPTS="${MAVEN_OPTS:--Xss128m -Xmx4g -XX:ReservedCodeCacheSize=128m}"
          ./build/mvn deploy -DskipTests -Dmaven.javadoc.skip=true -Dmaven.scaladoc.skip=true -Dmaven.source.skip -Dcyclonedx.skip=true -Pkubernetes,hadoop-provided,parquet-provided,hive,hadoop-cloud
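
Note: the version rewrite performed by the "Prep step version" step above boils down to a bash parameter expansion applied to every pom.xml. A minimal illustration follows; the JIRA_TAG value is hypothetical (the workflow default is NOJIRA).

# Illustration only: how prep_version derives the tagged version string.
JIRA_TAG=HWORKS-2203                      # hypothetical example tag
POM_VERSION_NO_JIRA=3.5.5-SNAPSHOT
echo "${POM_VERSION_NO_JIRA%-SNAPSHOT}-${JIRA_TAG}-SNAPSHOT"   # 3.5.5-HWORKS-2203-SNAPSHOT
POM_VERSION_NO_JIRA=3.5.5                 # non-SNAPSHOT base: the %-SNAPSHOT strip is a no-op
echo "${POM_VERSION_NO_JIRA%-SNAPSHOT}-${JIRA_TAG}-SNAPSHOT"   # 3.5.5-HWORKS-2203-SNAPSHOT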
2 changes: 1 addition & 1 deletion R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
Package: SparkR
Type: Package
Version: 3.5.8
Version: 3.5.7
Title: R Front End for 'Apache Spark'
Description: Provides an R Front end for 'Apache Spark' <https://spark.apache.org>.
Authors@R:
4 changes: 2 additions & 2 deletions assembly/pom.xml
@@ -21,7 +21,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../pom.xml</relativePath>
  </parent>

@@ -137,7 +137,7 @@
          <version>${project.version}</version>
        </dependency>
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <groupId>${hadoop.group}</groupId>
          <artifactId>hadoop-yarn-server-web-proxy</artifactId>
        </dependency>
      </dependencies>

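Note: the dependency above now resolves its groupId through the ${hadoop.group} property, which in this fork is expected to point at the HopsFS artifacts. A quick, non-authoritative way to check what the property resolves to, assuming the parent pom defines it and the maven-help-plugin is available:

# Prints the effective values of the (assumed) hadoop.group and hadoop.version properties.
./build/mvn -q help:evaluate -Dexpression=hadoop.group -DforceStdout
./build/mvn -q help:evaluate -Dexpression=hadoop.version -DforceStdout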
2 changes: 2 additions & 0 deletions assembly/src/main/assembly/assembly.xml
@@ -83,6 +83,8 @@
      <useProjectArtifact>false</useProjectArtifact>
      <excludes>
        <exclude>org.apache.hadoop:*:jar</exclude>
        <exclude>io.hops:*:jar</exclude>
        <exclude>io.hops.metadata:*:jar</exclude>
        <exclude>org.apache.spark:*:jar</exclude>
        <exclude>org.apache.zookeeper:*:jar</exclude>
        <exclude>org.apache.avro:*:jar</exclude>

2 changes: 1 addition & 1 deletion common/kvstore/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

2 changes: 1 addition & 1 deletion common/network-common/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

2 changes: 1 addition & 1 deletion common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

6 changes: 3 additions & 3 deletions common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

@@ -64,12 +64,12 @@

    <!-- Provided dependencies -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <groupId>${hadoop.group}</groupId>
      <artifactId>hadoop-client-api</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <groupId>${hadoop.group}</groupId>
      <artifactId>hadoop-client-runtime</artifactId>
      <version>${hadoop.version}</version>
    </dependency>

2 changes: 1 addition & 1 deletion common/sketch/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

2 changes: 1 addition & 1 deletion common/tags/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

2 changes: 1 addition & 1 deletion common/unsafe/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

2 changes: 1 addition & 1 deletion common/utils/pom.xml
@@ -22,7 +22,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>

2 changes: 1 addition & 1 deletion connector/avro/pom.xml
@@ -21,7 +21,7 @@
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.12</artifactId>
    <version>3.5.8-SNAPSHOT</version>
    <version>3.5.5</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>
