Conversation

@zhengruifeng zhengruifeng commented Dec 31, 2023

What changes were proposed in this pull request?

Enable the caching provided by setup-python

Why are the changes needed?

Avoid downloading the Python dependencies when possible.
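For reference, enabling this cache is a small change to the setup step; a minimal sketch, in which the Python version and dependency-file path are illustrative rather than the exact values used in this PR:

```yaml
# Illustrative workflow step only: the python-version and
# cache-dependency-path values are examples, not necessarily
# the ones this PR touches.
- uses: actions/setup-python@v5
  with:
    python-version: '3.9'
    # Cache pip's download cache; the cache key is derived from the
    # hash of the file(s) named in cache-dependency-path.
    cache: 'pip'
    cache-dependency-path: dev/requirements.txt
```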

Does this PR introduce any user-facing change?

no, infra-only

How was this patch tested?

CI, plus a manual check:

1. First run, to cache the dependencies:
https://github.com/zhengruifeng/spark/actions/runs/7363727839/job/20043467880


2. Second run, to reuse the cache:
https://github.com/zhengruifeng/spark/actions/runs/7367425047/job/20050701350


Was this patch authored or co-authored using generative AI tooling?

no

@zhengruifeng force-pushed the cache_non_docker_python branch from b7ffaf8 to 73dc435 on December 31, 2023 01:02
```
@@ -0,0 +1,11 @@
# PySpark dependencies for SQL tests

numpy==1.26.2
```
Contributor Author


> The requirements file format allows for specifying dependency versions using logical operators (for example `chardet>=3.0.4`) or specifying dependencies without any versions. In this case the `pip install -r requirements.txt` command will always try to install the latest available package version. To be sure that the cache will be used, please stick to a specific dependency version and update it manually if necessary.

Specify versions according to the suggestion in https://github.com/actions/setup-python?tab=readme-ov-file#caching-packages-dependencies

actually, I think maybe we should always specify the versions
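A sketch of the difference in requirements-file terms (package names and versions here are illustrative):

```
# Flexible specifiers: the file's hash (and so the cache key) never
# changes, but pip still resolves these to the newest matching release
# on every run, so new versions get downloaded despite a cache hit.
chardet>=3.0.4
pandas

# Pinned: the resolved set is stable until we bump it deliberately,
# which is what makes the setup-python cache reliably reusable.
numpy==1.26.2
```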

Contributor


Related prior discussion on pinning development dependencies: #27928 (review)

Contributor


> actually, I think maybe we should always specify the versions

I agree with this, and this is something I tried to do in the PR I linked to just above, but several committers were against it.

When I look at the number of PRs related to pinning dev dependencies over the past three years, I wonder if committers still feel the same way today.

Not pinning development dependencies creates constant breakages that can pop up whenever an upstream library releases a new version. When we pin dependencies, by contrast, we choose when to upgrade and deal with the potential breakage.

Member


Yeah, I don't have a great solution for this. Maybe we could have dedicated dependency files for CI, because we now have too many dependencies in CI across too many matrix combinations, but I'm not sure. At least I'm no longer strongly against this idea.

Contributor


Would you be open to my making another attempt at the approach in #27928? (@zhengruifeng can also take this on if they prefer, of course.)

Basically, we have two sets of development dependencies:

  • requirements.txt: direct dependencies only that are as flexible as possible; this is what devs install on their laptops
  • requirements-pinned.txt: pinned dependencies derived automatically from requirements.txt using pip-tools; this is used for CI

I know this adds a new tool that non-Python developers may not be familiar with (pip-tools), but it's extremely easy to use, has been around a long time, and is in use by many large projects, the most notable of which is Warehouse, the project that backs PyPI.
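Under that proposal, the day-to-day workflow would look roughly like this (shown as an illustrative transcript, with the file names as suggested above, not output from this repo):

```console
$ pip install pip-tools

# Compile the loose direct dependencies into a fully pinned set for CI.
# pip-compile resolves requirements.txt and writes the result with every
# transitive dependency pinned to an exact version.
$ pip-compile requirements.txt --output-file requirements-pinned.txt

# Developers keep installing the loose file; CI installs the pinned one:
$ pip install -r requirements-pinned.txt
```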

Contributor Author


@nchammas I just noticed the previous discussion in #27928.

I personally prefer requirements.txt files with pinned versions. One reason is that the dependencies are cached in the Docker image, and from time to time I was confused about which versions CI was actually using. For example, we previously used the cached `RUN python3.9 -m pip install numpy pyarrow ...`, and when PyArrow 13 was released on 2023-08-23, I didn't realize that release broke PySpark until the cached image was refreshed (on 2023-09-13).

But I don't feel very strongly about it, and will defer to @HyukjinKwon and @dongjoon-hyun on this.

@zhengruifeng zhengruifeng marked this pull request as draft December 31, 2023 01:18
@github-actions

We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!

@github-actions github-actions bot added the Stale label Apr 13, 2024
@github-actions github-actions bot closed this Apr 14, 2024
@zhengruifeng zhengruifeng deleted the cache_non_docker_python branch May 17, 2024 05:15