
Conversation

@dependabot dependabot bot commented on behalf of github Jan 18, 2024

Bumps pyspark from 3.4.2 to 3.5.0.

Commits
  • ce5ddad Preparing Spark release v3.5.0-rc5
  • 05b57bc [SPARK-45075][SQL] Fix alter table with invalid default value will not report...
  • b2b2ba9 [SPARK-44805][SQL] getBytes/getShorts/getInts/etc. should work in a column ve...
  • 695be32 [SPARK-45106][SQL] PercentileCont should check user supplied input
  • 0662175 [SPARK-45098][DOCS] Custom jekyll-redirect-from redirect.html template to fix ...
  • 8f730e7 [SPARK-45100][SQL] Fix an internal error from reflect() on NULL class and ...
  • 959d93a [SPARK-44508][PYTHON][DOCS] Add user guide for Python user-defined table func...
  • 3ceec3b [SPARK-44835][CONNECT] Make INVALID_CURSOR.DISCONNECTED a retriable error
  • a9d601c [SPARK-44640][PYTHON][FOLLOW-UP][3.5] Update UDTF error messages to include m...
  • 916b6f5 [SPARK-45050][SQL][CONNECT] Improve error message for UNKNOWN io.grpc.StatusR...
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels Jan 18, 2024
HonahX commented Jan 19, 2024

Mark: we also need to update the iceberg-spark-runtime used here:

@pytest.fixture(scope="session")
def spark() -> SparkSession:
    import os

    os.environ["PYSPARK_SUBMIT_ARGS"] = (
        "--packages org.apache.iceberg:iceberg-spark-runtime-3.4_2.12:1.4.0,org.apache.iceberg:iceberg-aws-bundle:1.4.0 pyspark-shell"
    )

Fokko commented Jan 19, 2024

@HonahX Good one! Maybe we should pass that in through an environment variable?

HonahX commented Jan 21, 2024

Hi @Fokko.

Maybe we should pass that in through an environment variable

Do you mean putting it in a .env file? Also, I am wondering whether it is possible to link the iceberg-spark-runtime's version to our pyspark version in pyproject.toml.

import importlib.metadata

spark_version = ".".join(importlib.metadata.version("pyspark").split(".")[:2])

os.environ["PYSPARK_SUBMIT_ARGS"] = (
    f"--packages org.apache.iceberg:iceberg-spark-runtime-{spark_version}_2.12:1.4.0,org.apache.iceberg:iceberg-aws-bundle:1.4.0 pyspark-shell"
)

That way, Dependabot can automatically update pyspark next time, and the tests will fail if Iceberg does not yet publish a runtime for that version. Does this sound good to you?
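The idea above can be sketched as a small standalone helper. This is a hypothetical illustration, not the project's actual fixture: the function name `runtime_packages` is made up, and only the major.minor part of the PySpark version is used because the iceberg-spark-runtime artifact is published per Spark minor release.

```python
import importlib.metadata


def runtime_packages(pyspark_version: str, iceberg_version: str = "1.4.0") -> str:
    """Build the --packages value for PYSPARK_SUBMIT_ARGS.

    Derives the Spark minor version (e.g. "3.5" from "3.5.0") so the
    iceberg-spark-runtime coordinate tracks the installed PySpark.
    """
    spark_minor = ".".join(pyspark_version.split(".")[:2])
    return (
        f"--packages org.apache.iceberg:iceberg-spark-runtime-{spark_minor}_2.12:{iceberg_version},"
        f"org.apache.iceberg:iceberg-aws-bundle:{iceberg_version} pyspark-shell"
    )


# In a conftest fixture, the installed version would come from the environment:
#   packages = runtime_packages(importlib.metadata.version("pyspark"))
print(runtime_packages("3.5.0"))
```

If no matching iceberg-spark-runtime artifact exists for the derived minor version, spark-submit fails to resolve the package, which is exactly the "fail if iceberg does not support the version yet" behavior described above.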

Fokko commented Jan 23, 2024

@HonahX I was thinking of adding this to the Dockerfile through an ENV step. Ah wait, the tests run outside of the container. I like your suggestion of using importlib a lot 👍

@dependabot dependabot bot force-pushed the dependabot/pip/pyspark-3.5.0 branch 2 times, most recently from 0f76f95 to 1605c0e on January 23, 2024 21:45
Bumps [pyspark](https://github.com/apache/spark) from 3.4.2 to 3.5.0.
- [Commits](apache/spark@v3.4.2...v3.5.0)

---
updated-dependencies:
- dependency-name: pyspark
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot force-pushed the dependabot/pip/pyspark-3.5.0 branch from 1605c0e to 658025c on January 24, 2024 08:34
HonahX commented Jan 25, 2024

@Fokko I opened a new PR to manually update the PySpark version and use importlib to fetch the version:
#303

dependabot bot commented on behalf of github Jan 25, 2024

Looks like pyspark is up-to-date now, so this is no longer needed.

@dependabot dependabot bot closed this Jan 25, 2024
@dependabot dependabot bot deleted the dependabot/pip/pyspark-3.5.0 branch January 25, 2024 10:01