Install python-3.12.6-amd64.exe (select the option to add Python to PATH).
Run these commands:
python -m venv mistral
mistral\Scripts\activate.bat
pip install --upgrade pip
pip install mistral_inference
Expected Behavior
The Python package "mistral_inference" should be installed.
Additional Context
This fails during installation of xformers with "No module named 'torch'".
C:\Work\2024-10-04_mistral>python -m venv mistral
C:\Work\2024-10-04_mistral>mistral\Scripts\activate.bat
(mistral) C:\Work\2024-10-04_mistral>pip install --upgrade pip
Requirement already satisfied: pip in c:\work\2024-10-04_mistral\mistral\lib\site-packages (24.2)
(mistral) C:\Work\2024-10-04_mistral>pip install mistral_inference
Collecting mistral_inference
Using cached mistral_inference-1.4.0-py3-none-any.whl.metadata (14 kB)
Collecting fire>=0.6.0 (from mistral_inference)
Using cached fire-0.7.0.tar.gz (87 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting mistral_common>=1.4.0 (from mistral_inference)
Using cached mistral_common-1.4.4-py3-none-any.whl.metadata (4.6 kB)
Collecting pillow>=10.3.0 (from mistral_inference)
Using cached pillow-10.4.0-cp312-cp312-win_amd64.whl.metadata (9.3 kB)
Collecting safetensors>=0.4.0 (from mistral_inference)
Using cached safetensors-0.4.5-cp312-none-win_amd64.whl.metadata (3.9 kB)
Collecting simple-parsing>=0.1.5 (from mistral_inference)
Using cached simple_parsing-0.1.6-py3-none-any.whl.metadata (7.3 kB)
Collecting xformers>=0.0.24 (from mistral_inference)
Using cached xformers-0.0.28.post1.tar.gz (7.8 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "C:\Work\2024-10-04_mistral\mistral\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
main()
File "C:\Work\2024-10-04_mistral\mistral\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Work\2024-10-04_mistral\mistral\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 332, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 302, in _get_build_requires
self.run_setup()
File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 503, in run_setup
super().run_setup(setup_script=setup_script)
File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 318, in run_setup
exec(code, locals())
File "<string>", line 24, in <module>
ModuleNotFoundError: No module named 'torch'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
(mistral) C:\Work\2024-10-04_mistral>
Suggested Solutions
Perhaps mistral_inference should require torch? Or perhaps xformers should require torch?
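The mechanism behind the failure can be illustrated: under PEP 517 build isolation, pip builds xformers in a clean temporary environment, and (as the traceback's `File "<string>", line 24` suggests) xformers' setup.py imports torch at module level, so the build fails whenever torch is not available there. A minimal, hypothetical sketch of that check — `torch_available` is not part of pip or xformers, just an illustration:

```python
import importlib.util

def torch_available() -> bool:
    """Return True if torch is importable in the current environment,
    mirroring the module-level `import torch` that xformers' setup.py
    appears to perform (per the traceback above)."""
    return importlib.util.find_spec("torch") is not None

# In pip's isolated build environment, torch is absent unless it is a
# declared build requirement of xformers, so a plain `import torch`
# raises ModuleNotFoundError: No module named 'torch'.
print(torch_available())
```

This is why pre-installing torch into the venv alone is not always enough: build isolation ignores the venv unless pip is invoked with `--no-build-isolation`.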
xformers downgrades torch and triton, and it has incompatibility issues with these packages, so it won't work.
You can read more about this issue here: mlfoundations/open_lm#267
For now, if you are installing mistral-inference from this repo, clone it and make this small change in your poetry.lock file:
[[package]]
name = "xformers"
version = "0.0.28"
description = "XFormers: A collection of composable Transformer building blocks."
optional = false
python-versions = ">=3.7"
files = []
This bypasses the versioning issues with xformers and the other libraries.
I got this to work by pre-installing torch. I had to be careful to install a version of torch compatible with the version of xformers that would subsequently be installed; my commands also installed a CUDA-enabled build of torch.
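The commenter's exact commands are not preserved in this excerpt. A minimal sketch of the approach, assuming xformers 0.0.28.post1 (which, to my knowledge, pins torch 2.4.1) and PyTorch's CUDA 12.1 wheel index — verify both against the versions pip actually resolves for you:

```shell
# Inside the activated "mistral" venv: install a matching CUDA-enabled
# torch first, then mistral_inference, so the xformers build can import
# torch when pip prepares its wheel.
pip install torch==2.4.1 --index-url https://download.pytorch.org/whl/cu121
pip install mistral_inference
```

If the build still fails with "No module named 'torch'", it is likely because pip's build isolation hides the pre-installed torch; retrying the second command with `--no-build-isolation` is a common workaround.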