
BUG: pip install mistral_inference: ModuleNotFoundError: No module named 'torch' #228

Open
chrisstankevitz opened this issue Oct 4, 2024 · 3 comments
Labels
bug Something isn't working

Comments

chrisstankevitz commented Oct 4, 2024

Python -VV

(mistral) C:\Work\2024-10-04_mistral>python -VV
Python 3.12.6 (tags/v3.12.6:a4a2d2b, Sep  6 2024, 20:11:23) [MSC v.1940 64 bit (AMD64)]

Pip Freeze

(mistral) C:\Work\2024-10-04_mistral>pip freeze

Reproduction Steps

  1. Buy a new computer
  2. Install Windows 10 22H2
  3. Install python-3.12.6-amd64.exe (select option to add python to path)
  4. Run these commands:
    1. python -m venv mistral
    2. mistral\Scripts\activate.bat
    3. pip install --upgrade pip
    4. pip install mistral_inference

Expected Behavior

The Python package "mistral_inference" should be installed.

Additional Context

This fails during installation of xformers with "no module named 'torch'".

C:\Work\2024-10-04_mistral>python -m venv mistral

C:\Work\2024-10-04_mistral>mistral\Scripts\activate.bat

(mistral) C:\Work\2024-10-04_mistral>pip install --upgrade pip
Requirement already satisfied: pip in c:\work\2024-10-04_mistral\mistral\lib\site-packages (24.2)

(mistral) C:\Work\2024-10-04_mistral>pip install mistral_inference
Collecting mistral_inference
  Using cached mistral_inference-1.4.0-py3-none-any.whl.metadata (14 kB)
Collecting fire>=0.6.0 (from mistral_inference)
  Using cached fire-0.7.0.tar.gz (87 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting mistral_common>=1.4.0 (from mistral_inference)
  Using cached mistral_common-1.4.4-py3-none-any.whl.metadata (4.6 kB)
Collecting pillow>=10.3.0 (from mistral_inference)
  Using cached pillow-10.4.0-cp312-cp312-win_amd64.whl.metadata (9.3 kB)
Collecting safetensors>=0.4.0 (from mistral_inference)
  Using cached safetensors-0.4.5-cp312-none-win_amd64.whl.metadata (3.9 kB)
Collecting simple-parsing>=0.1.5 (from mistral_inference)
  Using cached simple_parsing-0.1.6-py3-none-any.whl.metadata (7.3 kB)
Collecting xformers>=0.0.24 (from mistral_inference)
  Using cached xformers-0.0.28.post1.tar.gz (7.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "C:\Work\2024-10-04_mistral\mistral\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Work\2024-10-04_mistral\mistral\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Work\2024-10-04_mistral\mistral\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 332, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=[])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 302, in _get_build_requires
          self.run_setup()
        File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 503, in run_setup
          super().run_setup(setup_script=setup_script)
        File "C:\Users\cstankevitz\AppData\Local\Temp\pip-build-env-fwupzqhw\overlay\Lib\site-packages\setuptools\build_meta.py", line 318, in run_setup
          exec(code, locals())
        File "<string>", line 24, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

(mistral) C:\Work\2024-10-04_mistral>

Suggested Solutions

Perhaps mistral_inference should require torch? Or perhaps xformers should require torch?
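
The failure mode can be sketched as follows: xformers' build step runs its setup.py, which imports torch at module level, so in a fresh environment the import fails before any requirements are resolved. A minimal, hypothetical illustration (not xformers' actual setup code) using only the standard library:

```python
import importlib.util

def importable_at_build_time(module: str) -> bool:
    """A setup.py that does `import torch` at module level effectively
    requires the module to already be present in the build environment;
    this check mimics that precondition."""
    return importlib.util.find_spec(module) is not None

# In the fresh venv from the reproduction steps, torch is absent, so a
# build step that imports it aborts with ModuleNotFoundError:
if not importable_at_build_time("torch"):
    print("xformers build would fail here: No module named 'torch'")
```

Pre-installing torch into the environment, as later comments in this thread describe, satisfies this implicit build-time requirement.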

@Mindrix95

Hello, I got the exact same problem. Please help!


viragumathe5 commented Oct 15, 2024

The issue is due to xformers.

xformers downgrades torch and triton, and it has incompatibility issues with those packages, so the install fails.
You can read more about this issue here: mlfoundations/open_lm#267

For now, if you are installing mistral-inference from this repo, clone it and make this small change in your poetry.lock file:

[[package]]
name = "xformers"
version = "0.0.28"
description = "XFormers: A collection of composable Transformer building blocks."
optional = false
python-versions = ">=3.7"
files = []

This bypasses the version pinning for xformers and the other libraries.
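
For anyone scripting that lock-file tweak, here is a rough sketch (regex-based; the helper name is made up, and editing the file by hand as shown above works just as well) that blanks the `files` list of the xformers entry:

```python
import re

def blank_xformers_files(lock_text: str) -> str:
    """Blank the `files = [...]` list in the [[package]] entry for
    xformers, so Poetry skips its pinned wheel hashes for that package."""
    pattern = re.compile(
        r'(\[\[package\]\]\nname = "xformers".*?\nfiles = )\[.*?\]',
        re.DOTALL,
    )
    # Replace only the first (and normally only) xformers entry.
    return pattern.sub(r"\1[]", lock_text, count=1)
```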

@chrisstankevitz
Author

I got this to work by pre-installing torch. I had to be careful to install a version of torch compatible with the version of xformers that would subsequently be installed. These commands also install a CUDA-enabled build of torch:

pip install torch==2.4.1+cu124 torchvision==0.19.1+cu124 torchaudio==2.4.1+cu124 --index-url https://download.pytorch.org/whl/cu124
pip install mistral-inference

Versions I ended up using:

  • mistral_inference==1.5.0
  • torch==2.4.1+cu124
  • torchaudio==2.4.1+cu124
  • torchvision==0.19.1+cu124
  • xformers==0.0.28.post1
