
Proxy settings for a project #474

Open
amirhosseindavoody opened this issue Nov 17, 2023 · 12 comments
Labels
✨ enhancement Feature request

Comments

@amirhosseindavoody

Problem description

Add settings for http_proxy and https_proxy in project-specific pixi.toml files.
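Purely as an illustration of the request (a hypothetical sketch, not an existing pixi feature; the table name, keys and proxy URL are all assumptions), such a setting could look like:

[proxy]                                   # hypothetical table; pixi does not support this today
http = "http://proxy.example.com:8080"    # placeholder proxy URLs
https = "http://proxy.example.com:8080"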

@amirhosseindavoody amirhosseindavoody added the ✨ enhancement Feature request label Nov 17, 2023
@baszalmstra
Contributor

I think we should not include the proxy settings in the pixi.toml. The pixi.toml describes project settings, whereas proxy settings are user-level configuration of pixi itself (not of the project). We have been discussing a global configuration that would allow you to configure these kinds of settings (like the cache directory, etc.); that would be a good fit for storing the proxy settings. (see #172)

Pixi should already respect the HTTP_PROXY and HTTPS_PROXY environment variables though.
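As a quick illustration (a minimal sketch; the proxy URL and bypass list are placeholders), exporting the standard variables in the shell before invoking pixi should route its conda traffic through the proxy:

export HTTP_PROXY="http://proxy.example.com:8080"    # placeholder corporate proxy
export HTTPS_PROXY="http://proxy.example.com:8080"
export NO_PROXY="localhost,127.0.0.1"                # hosts that should bypass the proxy
pixi install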

@amirhosseindavoody
Author

Maybe a little bit of context on why I proposed this will help.

I work in a team where most users are not comfortable writing scripts or preparing the Python environment.

I found pixi made it very easy to prepare an environment: users download a Python script from our private GitHub repo and just call the pixi shell command to install all the dependencies. Most people don't even have pixi installed, so I included the pixi binary in the GitHub repo as well; everything is local to the project folder. The problem happens during package download, since users don't know how (or don't want) to set up a global proxy setting.
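One low-friction workaround for this kind of setup (a sketch only; the script name and proxy URL are assumptions) is to ship a tiny wrapper next to the bundled pixi binary that exports the proxy variables before invoking it, so users never have to configure anything globally:

#!/usr/bin/env bash
# run-pixi.sh -- hypothetical wrapper committed alongside the bundled pixi binary
export HTTP_PROXY="http://proxy.example.com:8080"    # placeholder corporate proxy
export HTTPS_PROXY="http://proxy.example.com:8080"
exec "$(dirname "$0")/pixi" "$@"                     # forward all arguments to the bundled binary

Users would then run ./run-pixi.sh shell instead of pixi shell.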

@maawoo

maawoo commented May 7, 2024

Hi all,

I have a related issue or rather a comment on:

Pixi should already respect the HTTP_PROXY and HTTPS_PROXY environment variables though.

It seems like this is not the case when pypi packages are installed. I have commented in another issue (here) about my case, but I guess it fits much better here. Usage of these proxy environment variables is quite common in university HPC systems such as ours, and I'm not able to use pixi install when my pyproject.toml/pixi.toml includes pypi dependencies:

pixi install --verbose
 INFO pixi::config: Global config not found at /etc/pixi/config.toml
 INFO pixi::config: Global config not found at /home/du23yow/.config/pixi/config.toml
 INFO pixi::config: Global config not found at /home/du23yow/.config/pixi/config.toml
 INFO pixi::config: Global config not found at /home/du23yow/.pixi/config.toml
 INFO pixi::environment: verifying prefix location is unchanged, with prefix file: /home/du23yow/0000_gedixr_development/.pixi/envs/default/conda-meta/pixi_env_prefix
 INFO pixi::lock_file::outdated: environment 'default' is out of date because it does not exist in the lock-file.
 INFO pixi::lock_file::resolve: uv keyring provider is disabled
 default:linux-64     [00:00:00] [────────────────────]    0/21   extracting pypi packages
 default:osx-64       [00:00:00] [────────────────────]    0/13   extracting pypi packages
  × failed to download pypi name mapping
  ├─▶ Middleware error: File still doesn't exist
  ├─▶ File still doesn't exist
  ╰─▶ No such file or directory (os error 2)

No problem at all when only conda-forge dependencies are listed in the file. Pixi version is 0.21.1.
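A quick sanity check in this situation (a minimal sketch; the host names are placeholders) is to confirm that both the upper- and lower-case variants are exported in the shell that runs pixi, since different HTTP stacks read different spellings:

env | grep -i proxy
# expected to show something like (placeholder values):
# HTTPS_PROXY=http://proxy.hpc.example:3128
# https_proxy=http://proxy.hpc.example:3128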

@javs1979

javs1979 commented Jun 7, 2024

Hello there,

I have a similar-ish issue here.

pixi.toml

[project]
name = "hello-pixi-single-pypi-package"
version = "0.1.0"
description = "Add a short description here"
authors = ["Me"]
channels = ["https://my_corporate_artifactory_instance/art/conda"]
platforms = ["win-64"]

[pypi-options]
index-url = "https://my_corporate_artifactory_instance/art/api/pypi/my-pypi/simple/"

[tasks]

[dependencies]
python = "==3.11"

[pypi-dependencies]
rich = "*"

Context

  • Using pixi 0.23.0.
  • The corporate Artifactory instance is always used to consume all the packages I need (conda, Python, R, .NET, you name it). I can vanilla pip install or conda install anything without a hitch. NO proxy authentication required.
  • The project is configured with the bare minimum configuration (from pixi init, which propagates the correct configuration properties from the global config.toml).

Expected behavior

pixi install with [pypi-dependencies] should install just like [dependencies] in all scenarios.

Actual behavior

  • pixi install in a project with only [dependencies] works like a charm.
  • pixi install in a project with any dependency in [pypi-dependencies] stalls about here (using -vvv for debugging)...

[...]
DEBUG resolve_conda{group=default platform=win-64}: reqwest::connect: starting new connection: https://raw.githubusercontent.com/
DEBUG hyper_util::client::legacy::connect::dns: resolving host="raw.githubusercontent.com"
DEBUG resolve_conda{group=default platform=win-64}: hyper_util::client::legacy::connect::http: connecting to 185.199.108.133:443
DEBUG resolve_conda{group=default platform=win-64}: hyper_util::client::legacy::connect::http: connecting to 185.199.109.133:443
DEBUG resolve_conda{group=default platform=win-64}: hyper_util::client::legacy::connect::http: connecting to 185.199.110.133:443
WARN resolve_conda{group=default platform=win-64}: reqwest_retry::middleware: Retry attempt #0. Sleeping 746.933193ms before the next attempt
DEBUG resolve_conda{group=default platform=win-64}: hyper_util::client::legacy::connect::http: connecting to 185.199.110.133:443
⠦ updating lock-file [00:00:49] [────────────────────────────────────────] 0/4
⠓ default:win-64 [00:00:49] extracting pypi packages
[...]

...and then, after multiple reqwest_retry::middleware: Retry attempts, here goes...

× failed to download pypi mapping from https://raw.githubusercontent.com/prefix-dev/parselmouth/main/files/compressed_mapping.json location
├─▶ Middleware error: Request error: error sending request for url (https://raw.githubusercontent.com/prefix-dev/parselmouth/main/files/
│ compressed_mapping.json)
├─▶ Request error: error sending request for url (https://raw.githubusercontent.com/prefix-dev/parselmouth/main/files/compressed_mapping.json)
├─▶ error sending request for url (https://raw.githubusercontent.com/prefix-dev/parselmouth/main/files/compressed_mapping.json)
├─▶ client error (Connect)
├─▶ tcp connect error: A connection attempt failed because the connected party did not properly respond after a period of time, or the established connection failed because the connected host did not respond. (os error 10060)
╰─▶ A connection attempt failed because the connected party did not properly respond after a period of time, or the established connection failed because the connected host did not respond. (os error 10060)

Questions

  • How can [dependencies] be installed so easily while [pypi-dependencies] fails miserably?
  • Why the connection attempt to raw.githubusercontent.com? I expect connections to the corporate Artifactory instance and nothing else.
  • If, for some reason, that connection to raw.githubusercontent.com (or any other URL, for that matter) is absolutely required, how can I set corporate proxy settings just for it?

UPDATE: I gather the problem is closely related to parselmouth, the conda mapping provider.

@javs1979

javs1979 commented Jun 8, 2024

Well, after a few hours of scratching my head, it seems I found a workaround 🤞 However, unless I misunderstand some underlying mechanics in pixi, the workaround appears counter-intuitive.

In pixi.toml, I added this conda-pypi-map...

conda-pypi-map = { "https://my_corporate_artifactory_instance/art/conda" = "mappings.json" }

...and the mappings.json file, in project root, contains a single Python package reference...

{ "rich": "rich" }

...and voilà, pixi install does its magic, installing both conda and pypi packages 🎉. Maybe the contents (or even the existence) of mappings.json are irrelevant and only the presence of conda-pypi-map with a dummy .json reference matters, although I didn't check this.

On a side note, in reference to #474 (comment), @maawoo, I tested different scenarios with and without proxy environment variables, and those are properly propagated to pixi. Maybe my workaround will be of some help in your situation? After all, we have middleware and (possibly) pypi name mapping errors in common.

@javs1979

javs1979 commented Jun 8, 2024

@amirhosseindavoody, @maawoo, in regard to proxy settings, I agree with @baszalmstra; using the preexisting proxy environment variables (http_proxy, https_proxy, no_proxy) might just be the way to go.

Now, based on my discoveries in #474 (comment), it seems the application logic associated with conda-pypi-map somehow short-circuits the parselmouth mapping checks (online by default?), which otherwise require direct Internet access to prefix-dev assets currently hosted on GitHub.

In my situation, that's good news, because direct Internet access to GitHub assets is prohibited in my corporate environment. However, I don't need any mappings; all I want is to consume conda and pypi packages and control the sources of truth using the appropriate channels, pypi-options, [dependencies] and [pypi-dependencies] references in pixi.toml.

Which makes me think... is there a simpler configuration of some sort (a flag, environment variable or property in pixi.toml or somewhere else) that single-handedly disables the parselmouth online mapping checks, all without relying on conda-pypi-map (since no mappings are needed in my situation)?

Any ideas, @baszalmstra?

@javs1979

javs1979 commented Jun 10, 2024

Well, as a follow-up to #474 (comment), it appears a mappings.json file that...

  • exists
  • is empty (but is a valid JSON)

...like...

{}

...gets the job done to circumvent parselmouth's online conda mapping checks.
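In other words (a minimal sketch of the workaround described above), creating that file is a one-liner, with pixi.toml's conda-pypi-map pointing at it as shown earlier:

echo '{}' > mappings.json    # empty but valid JSON mapping in the project root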

@ernimd

ernimd commented Jul 5, 2024

For future visitors, modify your toml as per this snippet:

[project]
...
channels = ["conda-forge", "pytorch"]
conda-pypi-map = { pytorch = "compressed_mapping.json", conda-forge = "compressed_mapping.json", nvidia = "compressed_mapping.json" }
...

Note that the nvidia channel is added later as a feature to an environment. If it is not present here, the online HTTP request will be sent and the install will fail...

Also, as the above comment mentions, the JSON file can be an empty mapping...

@maawoo

maawoo commented Aug 9, 2024

Thanks @ernimd and @javs1979 for the workaround!
It would be great to have a proper solution at some point, though. As far as I understand, when I use conda-forge = "compressed_mapping.json" with an empty JSON, the pypi solve will not be able to determine which packages have already been installed by conda-forge. Or am I wrong?

@ruben-arts
Contributor

You're not wrong, but you could host the map yourself or copy the map into the local file, and then it would actually work: https://github.com/prefix-dev/parselmouth/blob/main/files/compressed_mapping.json

A real fix would be a proper implementation of the purl spec, for which we've created a CEP (conda enhancement proposal).
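A sketch of that local-copy approach (assuming the proxy variables are already exported and that conda-forge is the channel key used in your pixi.toml):

# fetch the mapping once, through the corporate proxy, into the project root
curl --proxy "$HTTPS_PROXY" -L -o compressed_mapping.json \
  https://raw.githubusercontent.com/prefix-dev/parselmouth/main/files/compressed_mapping.json

# then point the channel at the local copy in pixi.toml:
# conda-pypi-map = { conda-forge = "compressed_mapping.json" }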

@geoHeil

geoHeil commented Aug 26, 2024

With this I can get the proxy to work neatly for conda packages. However, I still fail to get the pypi packages to resolve despite setting:

conda-pypi-map = { conda-forge = "compressed_mapping.json" }

@geoHeil

geoHeil commented Aug 27, 2024

For me it only works when setting

[pypi-options]
index-url

twice (once in the project settings and one more time in the project configuration), plus setting the empty mapping JSON.
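Pulling the thread together, a sketch of the project-level configuration that worked for several commenters (the Artifactory URLs are the placeholders used earlier in this thread; the conda-pypi-map key must match the channel listed under channels):

# pixi.toml
[project]
# ...name, channels, platforms as in the earlier examples...
conda-pypi-map = { "https://my_corporate_artifactory_instance/art/conda" = "mappings.json" }

[pypi-options]
index-url = "https://my_corporate_artifactory_instance/art/api/pypi/my-pypi/simple/"

# mappings.json (project root): empty but valid JSON
# {}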
