Support pip download
#3163
Hi everyone, is this feature on the roadmap? I'm guessing supporting `pip download` would be straightforward, since uv already downloads packages; it would just have to not install them. Any help needed?
We should be able to support it... though it's not trivial, because we don't store the wheels themselves in the cache. What are the typical use-cases here?
In my case it's a Docker build in a GitHub workflow. Caching Docker layers on GitHub runners is impossible AFAIK. Caching the downloaded wheels instead wouldn't hit PyPI and wouldn't need any additional caching from Docker. This could be even better if uv offered an equivalent of `pip download`.
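A rough sketch of what the image-build side of that workflow could look like, assuming the wheels were pre-downloaded into a `wheels/` directory before `docker build` runs (the base image and paths are placeholders; `--no-index` and `--find-links` are standard pip flags):

```dockerfile
FROM python:3.12-slim
# wheels/ was populated beforehand, e.g. with:
#   pip download -r requirements.txt -d wheels
COPY requirements.txt .
COPY wheels /wheels
# Install strictly from the local directory; this step never contacts PyPI.
RUN pip install --no-index --find-links=/wheels -r requirements.txt
```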
I'm mostly wondering if it has to be wheels, or if we could just make it easy to pre-populate the uv cache.
Wheels are supported by standard tooling.
I'm not sure how much we should go out of our way to support using pip to consume the output of uv? It seems weird to use uv in one case and pip in another, right?
If it were equally easy for us, I'd probably prefer to output wheels; it's a nicer intermediate format that's less coupled to our internal cache. I'd need to see how hard it is to support.
Hi, my use case is that I have to supply bundles of my application with all dependencies, for systems where it is not possible to download them (firewall blocking). Right now I use `pip download`, which results in a bunch of wheel files. We would also like to be able to download them cross-platform.
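For reference, the pip workflow being described looks roughly like this (the platform tag, Python version, and directory name are just examples; `--platform`, `--python-version`, `--only-binary`, `--no-index`, and `--find-links` are existing pip options):

```console
$ # on a connected machine: fetch prebuilt wheels for a different target platform
$ pip download -r requirements.txt -d bundle/ \
      --platform manylinux2014_x86_64 --python-version 3.11 \
      --only-binary=:all:
$ # on the firewalled machine: install without touching any index
$ pip install --no-index --find-links bundle/ -r requirements.txt
```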
That makes sense, thanks.
That part of the workflow may not be entirely under your control.
Note, both `pip download` and `pip wheel` come with tradeoffs. As a result, I tend to just always use `pip wheel`. Some of these tradeoffs were actually discussed in #1681.
Hi @samypr100, in this specific case I really just want to download pre-built wheels and not build anything. As for `pip wheel`, I can't cross-platform download (nor compile) anything.
Also quite interested in this. In our case, our security scanning tool needs to run on a folder of wheel / source distributions; we currently use `pip download` to gather these.
Another use case would be downloading build-time dependencies (in addition to runtime dependencies). I'm not sure if this is feasible, since it is not supported by pip (pypa/pip#7863). However, this would be extremely useful when building a Flatpak that involves Python packages, which is currently broken because of that (flatpak/flatpak-builder-tools#380).
It would be even better if this also covered build-time dependencies.
We have a business use case to scan the dependencies of a Python project; we need to `pip download` the requirements, and it's slow without uv 😒
Not saying you shouldn't use uv, but do you have an example where pip 24.2 is slow at downloading? Especially if you've already pre-resolved the requirements with `uv pip compile`, as hopefully the biggest bottleneck is IO. I should be able to profile and see if there's any low-hanging fruit in pip that can be fixed. It should also be a good scenario to see whether uv can advertise being faster here or not.
`pip download` is pretty OK/fast enough for my needs. I also open multiple processes and am constrained only by network, so uv won't be any faster without a cache (I'm not the guy above). It is still necessary for two reasons. By using the cache it would indeed be faster than pip for same-platform downloads. Although, for my use case, I need to download cross-platform, so the packages won't be in the cache (e.g. numpy or pandas, which are large platform-specific packages).
At the risk of repeating what other people have said, to chime in with my use case (also Docker image building for deployment; reproducibility is a secondary concern for me) and perhaps shed light on why `pip download` is important enough to support.
Also, I'd note that as a user of these toolsets it can be confusing to keep up with the proliferation of ways to do the one thing as the state of the art evolves. I also found a related issue about this.
Hello everyone, I wanted to contribute some additional use cases for consideration. While most discussions here focus on the cloud, my perspective comes from the embedded world. Consider the 10 billion devices currently operating on cellular networks, of which 1.8 billion are IoT/M2M devices. Many of these devices do not have access to "unlimited good bandwidth" but still require software updates, CI, etc. When using Python, one way to accelerate these deployments is by pre-fetching PyPI packages ahead of time, e.g.:

```shell
file_url="https://files.pythonhosted.org/packages/5e/31/d49a3dff9c4ca6e6c09c2c5fea95f58cf59cc3cd4f0d557069c7dccd6f57/tensorflow-2.7.4-cp39-cp39-manylinux2010_x86_64.whl"
wget --continue --quiet -P . "$file_url"
```

The actual software deployment could then use this pre-fetched file and get the rest of the dependencies from PyPI directly. By enabling uv to use these pre-fetched/cached files, deployments become more efficient. Also, it's intuitive for a user to think about uv operations as downloading, storing, and installing.
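The deployment step could then consume the pre-fetched wheel while still resolving everything else normally. A sketch with pip (the requirements file name is a placeholder; `--find-links` is a standard pip option that makes pip prefer matching local files):

```console
$ # pip picks up the local tensorflow wheel from . and
$ # fetches the remaining dependencies from PyPI
$ pip install --find-links . -r requirements.txt
```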
I don't think this is the same issue? This should already be possible: if you have acquired the wheels, you can install directly from them (or point a local `--find-links` directory at them).
Also, pretty sure you can install with uv on your base machine, copy the cache to the other devices, and then point the uv cache at the copied directory. This should use even fewer resources (CPU and storage) on your IoT devices, as there's no extra step of unzipping the contents and storing them somewhere.
Unfortunately it's not possible. I have a better explanation here if you are interested: #7296. Installing a single package isn't the goal here; the goal is managing all the dependencies with uv.
In this use case, the biggest problem is data usage (on some devices, you pay per MB), and uv's cache contains the unzipped versions of wheels. For example TensorFlow, which is ~400 MB as a wheel, expands into gigabytes of data.
Is the concern the data copied onto the device before it does any downloading, or the total amount of storage on the device? If it's the total storage, you will use less space by copying the cache: copying the wheel takes up the wheel plus the install, whereas copying the cache is just the install, and the site-packages location will hard-link into the cache and use no additional space. If it's the initial copy, you could zip the uv cache up, ship a small script that unzips it into the actual uv cache folder, and then delete the zip. I'm not saying it wouldn't be helpful for uv to have a download function and what you propose in #7296, just spitballing solutions with existing tools.
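A minimal sketch of that zip-and-ship workaround, assuming `/opt/uv-cache` as a placeholder destination on the device (`uv cache dir` and the `UV_CACHE_DIR` environment variable are uv's documented cache commands/overrides; the helper function names are made up here):

```shell
pack_cache() {
    # Archive an existing cache directory into uv-cache.tgz in the cwd.
    tar -czf uv-cache.tgz -C "$1" .
}

unpack_cache() {
    # Restore the archive ($1) into the destination directory ($2),
    # then delete the archive to reclaim space.
    mkdir -p "$2"
    tar -xzf "$1" -C "$2"
    rm "$1"
}

# On the build machine:
#   pack_cache "$(uv cache dir)"
# On the device, after copying uv-cache.tgz over:
#   unpack_cache uv-cache.tgz /opt/uv-cache
#   export UV_CACHE_DIR=/opt/uv-cache
```

Shipping the compressed archive keeps the transfer small while the unpacked cache on the device stays hard-linkable by uv installs.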
According to the prior discussion, instead of having two separate commands, I suggest a single `uv collect`:

```console
$ # auto-build sdists.
$ uv collect
$ # prevent including prebuilt wheels.
$ uv collect --requires-build-only
$ # cross-platform lockfile resolving.
$ uv collect --python-platform windows
$ # avoid building sdists, but clone them as they are (for sdists copy the
$ # `tar.gz` file, for git dependencies clone the repo/subdirectory, for
$ # editables copy the folder).
$ uv collect --clone
$ # by default, it collects all indexes; we can limit it.
$ uv collect --exclude-index pypi   # collect all others except `pypi`
$ uv collect --index internal_pypi  # only `internal_pypi`
```

I also think it could be integrated with some part of uv's caching system. I hope this helps move this issue and #1681 forward.
This would be especially useful for building Docker images. You could then rely on uv for a quick resolve and use a simple `pip install --no-deps --find-links` in your Dockerfile.
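A minimal sketch of that split, assuming a `requirements.in` input file (file and directory names are placeholders; `uv pip compile -o` and the pip flags shown are existing options):

```console
$ # fast resolve with uv
$ uv pip compile requirements.in -o requirements.txt
$ # fetch the pinned wheels on the host
$ pip download -r requirements.txt -d wheels/
$ # then, in the Dockerfile:
$ #   RUN pip install --no-deps --no-index --find-links /wheels -r requirements.txt
```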