Following #2182 and #2172.

We regularly have trouble with transitive dependencies breaking "randomly" on different environments: `pip install udata` (or using https://github.com/opendatateam/docker-udata).

Example: `jsonschema` gets upgraded, either because it's not pinned and is therefore installed at its latest version in a new env, or because it's pinned for various reasons (as happened in #2172). `jsonschema==3.0.1` depends on `six>=1.11.0`, but sometimes an older version of `six` is already installed. In those cases, `pip install` won't upgrade `six`, thus breaking the install.
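To illustrate the failure mode described above (the pre-existing `six` version here is just an example):

```console
# An older six is already present in the environment (version is illustrative)
$ pip freeze | grep six
six==1.10.0

# jsonschema 3.0.1 requires six>=1.11.0; in the scenario described above,
# the already-installed six is not upgraded, leaving the environment broken
$ pip install jsonschema==3.0.1
```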
Proposed minimal solution: generate the requirements files from `pipenv lock -r` after creating a venv with `pipenv install -r requirements/install.pip`. This has the advantage of pinning every dependency, transitive ones included, leaving no room for discrepancies between environments. In our example, if `jsonschema==3.0.1` is pinned in `install.pip`, `six==1.12.0` will be pinned in the regenerated `install.pip`.
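Concretely, the generation step would look roughly like this (a sketch based on the commands above, assuming the existing `requirements/install.pip` layout):

```console
# Bootstrap a venv/Pipfile from the current requirements file
$ pipenv install -r requirements/install.pip

# Regenerate a standard pip requirements file with every dependency pinned,
# transitive ones included
$ pipenv lock -r > requirements/install.pip
```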
By generating a standard pip requirements file, we do not need to change our production deploy process (it remains a plain `pip install -r`). Still, there are some impacts:
- We need to switch to `pipenv` in our local envs and maintain a `Pipfile` and `Pipfile.lock` beside the usual requirements file (or just generate the "extended" requirements once with `pipenv` and then keep relying on `pip`? I'm quite sure this would lead to trouble; it's cleaner to list our core dependencies in a `Pipfile`).
- https://pyup.io will try to update each and every dependency in our project, which will quickly become a mess. There seems to be partial support for `Pipfile`, though: Support Pipfiles (pyupio/pyup#197).
- We need to regenerate the requirements files every time we add something to the `Pipfile`: maybe this could be automated at release time (sketched below)?
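For instance, a release step along these lines (purely hypothetical, the exact sequence and commit message are placeholders) would keep the pinned file in sync with the `Pipfile`:

```console
# Re-lock and regenerate the fully pinned pip file, then commit both
$ pipenv lock -r > requirements/install.pip
$ git add Pipfile.lock requirements/install.pip
$ git commit -m "Update pinned requirements"
```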
We could also switch completely to `pipenv`, including in our deploy process, but I feel this is quite touchy.
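For reference, a full switch would roughly mean deploying straight from `Pipfile.lock`, something like the following (a sketch, not a tested deploy recipe):

```console
# --deploy aborts if Pipfile.lock is out of date with the Pipfile,
# --system installs into the system environment instead of a venv
$ pipenv install --deploy --system
```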