uv lock is extremely slow with google artifact registry #6104

Closed
ewianda opened this issue Aug 15, 2024 · 11 comments · Fixed by #6470
Labels: performance (Potential performance improvement)

@ewianda

ewianda commented Aug 15, 2024

Before the merge of PR #5089, using uv lock with Google Artifact Registry was quite efficient. However, post-merge, the locking process for a project with around 200 requirements has drastically slowed down, now taking up to an hour. Additionally, the cache size has escalated to approximately 300 GB, which seems excessive.

As a comparison, reconfiguring the repository to use PyPI reduces the lock time to under 3 minutes without utilizing the cache.
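(For reference, a minimal sketch of how such a comparison can be reproduced, assuming a hypothetical Artifact Registry index URL; real repositories follow https://REGION-python.pkg.dev/PROJECT/REPOSITORY/simple/ and require authentication:)

# Point uv at the Artifact Registry Python repository (hypothetical URL).
export UV_INDEX_URL="https://us-central1-python.pkg.dev/my-project/my-repo/simple/"
# Time the lock with verbose output.
time uv lock -vv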

Thank you for looking into this matter. I appreciate your efforts in maintaining and improving this tool.

@charliermarsh
Member

Wow, interesting that it was so much faster before! Hmm, ok. We'll likely need to revert then. Is there any chance you could share --verbose logs of a lock step on Google Artifact Registry?

@charliermarsh charliermarsh self-assigned this Aug 15, 2024
@charliermarsh charliermarsh added the performance Potential performance improvement label Aug 15, 2024
@ewianda
Author

ewianda commented Aug 15, 2024

Sure, I can share the logs. Do you want me to run the lock to completion, or just for a while and then stop it?

@charliermarsh
Member

A run to completion would be great, but it doesn't need to be the entire set of 200 requirements; a representative sample is fine, ideally with -vv.

@ewianda
Author

ewianda commented Aug 15, 2024

uv.log

I ran it for just 1 package (apache-beam[gcs]) and it took almost a minute.

@charliermarsh
Member

Thanks. It's such a shame that the registry doesn't support PEP 658 or range requests. I'm amazed by how many commercial registries don't at least support the standard.
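(As a rough way to check what a given registry supports, assuming a hypothetical index URL: PEP 658 metadata shows up as data-dist-info-metadata / data-core-metadata attributes in the Simple index HTML, and range-request support can be probed with a one-byte request:)

# Does the Simple index advertise separately fetchable metadata (PEP 658)?
curl -s https://REGISTRY/simple/httplib2/ | grep -o 'data-dist-info-metadata[^>]*'
# Does the file host honor HTTP range requests (expect a 206 status if it does)?
curl -s -o /dev/null -w '%{http_code}\n' -r 0-0 https://REGISTRY/path/to/httplib2-0.22.0-py3-none-any.whl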

@morotti

morotti commented Aug 15, 2024

(not a uv dev)

In my experience, apache-beam and many Google libraries have the issue of pinning highly specific versions of dependencies, notably protobuf. I often see pip running into trouble trying to resolve the massive dependency tree with deep conflicts.

It may be worth trying to install your package while forcing a specific version of protobuf. Does that get the installation to resolve faster?

uv pip install yourpackage protobuf==4.25.4
uv pip install yourpackage protobuf==3.20.3
uv pip install yourpackage protobuf==5.27.3

from the logs:

4.831562s   4s  DEBUG uv_resolver::resolver Adding transitive dependency for apache-beam==2.58.0: protobuf>=3.20.3, <4.0.dev0 | >=4.1.dev0, <4.21.dev0 | >=4.22.dev0, <4.22.0 | >4.22.0, <4.23.dev0 | >=4.25.dev0, <4.26.0

@ewianda
Author

ewianda commented Aug 15, 2024

I did do that; basically, I exported my requirements.txt with all the pinned versions into pyproject.toml to see if that would limit the resolution, but it didn't make a difference.
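(For illustration, a minimal sketch of what that pyproject.toml might look like; the project name is hypothetical and the pins below just reuse versions mentioned elsewhere in this thread:)

[project]
name = "my-app"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    # exact pins exported from requirements.txt
    "apache-beam[gcs]==2.58.0",
    "protobuf==4.25.4",
    "httplib2==0.22.0",
]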

@charliermarsh
Member

I did do that; basically, I exported my requirements.txt with all the pinned versions into pyproject.toml to see if that would limit the resolution, but it didn't make a difference.

Is this materially slower than when you uv pip sync those specific versions?
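(For comparison, a rough sketch of the uv pip workflow being referred to here; the file names are just examples:)

# Resolve the pinned set once into a fully-pinned requirements file...
uv pip compile pyproject.toml -o requirements.txt
# ...then install exactly that set into the current environment.
uv pip sync requirements.txt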

@ewianda
Author

ewianda commented Aug 15, 2024

I did do that; basically, I exported my requirements.txt with all the pinned versions into pyproject.toml to see if that would limit the resolution, but it didn't make a difference.

Is this materially slower than when you uv pip sync those specific versions?

No, it is not, though I am getting a hash mismatch error:

warning: `uv sync` is experimental and may change without warning
Resolved 83 packages in 14ms
error: Failed to prepare distributions
  Caused by: Failed to fetch wheel: httplib2==0.22.0
  Caused by: Hash mismatch for `httplib2==0.22.0`

Expected:
  sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81

Computed:
  sha256:0f3338153b99b37ab0cdff4613b2b4f391e7b4cf9daa07a59f40c88a39b87d5a

@charliermarsh
Member

Does your package have an invalid hash? Have you checked what the registry provides vs. the true hash of the distribution?
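(One way to spot-check this, assuming a hypothetical download URL taken from the registry's Simple index; the expected hash is whatever the index advertises, while the computed hash comes from the file actually served:)

# The hash the registry advertises for the file (the #sha256=... URL fragment).
curl -s https://REGISTRY/simple/httplib2/ | grep -o 'sha256=[0-9a-f]*'
# The hash of the wheel the registry actually serves.
curl -sL -o httplib2-0.22.0-py3-none-any.whl "https://REGISTRY/path/to/httplib2-0.22.0-py3-none-any.whl"
sha256sum httplib2-0.22.0-py3-none-any.whl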

@ewianda
Author

ewianda commented Aug 15, 2024

Hmm, actually it is due to the way we are setting up Google Artifact Registry.

We have a virtual repository that points to a standard repository (internal packages) and to PyPI as a remote repository.

It seems the virtual repository only provides the one distribution (sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81), while the remote repository has both wheel and sdist files.

TL;DR: the sync fails with virtual repositories and works with a remote repository on Google Artifact Registry.

Note: changing the lock process to use the remote repository doesn't change the lock time.
