Operating System: Ubuntu 22.04

I have already installed both gcc and g++ on my system:
(inference) user@user-X11DAi-N:/pyprogram/inference$ gcc --version
gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
(inference) user@user-X11DAi-N:/pyprogram/inference$ g++ --version
g++-11 (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Running `pip install llama-cpp-python` nevertheless fails while building the wheel:

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /home/user/anaconda3/envs/inference/bin/python /home/user/anaconda3/envs/inference/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmp77od368h
cwd: /tmp/pip-install-wiqu4fsn/llama-cpp-python_f88e9e6ee6de4f55a14df58cf7e595ae
Building wheel for llama-cpp-python (pyproject.toml) ... error
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)
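Having gcc and g++ on the PATH is not by itself enough: CMake consults the `CC`/`CXX` environment variables first, and the configure step aborts if `CXX` names a binary that does not exist. A quick way to check whether an override is set in the current shell:

```shell
# Print the compiler overrides CMake will honour; "<unset>" means
# CMake falls back to searching the PATH for cc/c++ itself.
echo "CC=${CC:-<unset>}"
echo "CXX=${CXX:-<unset>}"
```

If `CXX` prints a path or name that `command -v` cannot resolve, that mismatch reproduces the error below.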
(inference) user@user-X11DAi-N:~/pyprogram/inference$ pip install llama-cpp-python --verbose
Using pip 24.2 from /home/user/anaconda3/envs/inference/lib/python3.10/site-packages/pip (python 3.10)
Looking in indexes: http://mirrors.aliyun.com/pypi/simple/
Collecting llama-cpp-python
Using cached http://mirrors.aliyun.com/pypi/packages/1f/19/89836022affc1bf470e2485e28872b489254a66fe587155edba731a07112/llama_cpp_python-0.2.90.tar.gz (63.8 MB)
Running command pip subprocess to install build dependencies
Using pip 24.2 from /home/user/anaconda3/envs/inference/lib/python3.10/site-packages/pip (python 3.10)
Looking in indexes: http://mirrors.aliyun.com/pypi/simple/
Collecting scikit-build-core>=0.9.2 (from scikit-build-core[pyproject]>=0.9.2)
Using cached http://mirrors.aliyun.com/pypi/packages/95/a6/fbf3a66ed198138163059f2448ef01a473260cdb29ba58f178769bac7216/scikit_build_core-0.10.6-py3-none-any.whl (165 kB)
Collecting exceptiongroup>=1.0 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
Using cached http://mirrors.aliyun.com/pypi/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl (16 kB)
Collecting packaging>=21.3 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
Using cached http://mirrors.aliyun.com/pypi/packages/08/aa/cc0199a5f0ad350994d660967a8efb233fe0416e4639146c089643407ce6/packaging-24.1-py3-none-any.whl (53 kB)
Collecting pathspec>=0.10.1 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
Using cached http://mirrors.aliyun.com/pypi/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl (31 kB)
Collecting tomli>=1.2.2 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
Using cached http://mirrors.aliyun.com/pypi/packages/97/75/10a9ebee3fd790d20926a90a2547f0bf78f371b2f13aa822c759680ca7b9/tomli-2.0.1-py3-none-any.whl (12 kB)
Installing collected packages: tomli, pathspec, packaging, exceptiongroup, scikit-build-core
Successfully installed exceptiongroup-1.2.2 packaging-24.1 pathspec-0.12.1 scikit-build-core-0.10.6 tomli-2.0.1
Installing build dependencies ... done
Running command Getting requirements to build wheel
Getting requirements to build wheel ... done
Running command pip subprocess to install backend dependencies
Using pip 24.2 from /home/user/anaconda3/envs/inference/lib/python3.10/site-packages/pip (python 3.10)
Looking in indexes: http://mirrors.aliyun.com/pypi/simple/
Collecting cmake>=3.21
Using cached http://mirrors.aliyun.com/pypi/packages/b0/bd/c92eb92654e41483fc79f309863c55647bcacbbc9b1312a9684d6e0ba725/cmake-3.30.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (26.9 MB)
Collecting ninja>=1.5
Using cached http://mirrors.aliyun.com/pypi/packages/6d/92/8d7aebd4430ab5ff65df2bfee6d5745f95c004284db2d8ca76dcbfd9de47/ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl (307 kB)
Installing collected packages: ninja, cmake
Creating /tmp/pip-build-env-qy09681h/normal/bin
changing mode of /tmp/pip-build-env-qy09681h/normal/bin/ninja to 775
changing mode of /tmp/pip-build-env-qy09681h/normal/bin/cmake to 775
changing mode of /tmp/pip-build-env-qy09681h/normal/bin/cpack to 775
changing mode of /tmp/pip-build-env-qy09681h/normal/bin/ctest to 775
Successfully installed cmake-3.30.3 ninja-1.11.1.1
Installing backend dependencies ... done
Running command Preparing metadata (pyproject.toml)
*** scikit-build-core 0.10.6 using CMake 3.30.3 (metadata_wheel)
Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama-cpp-python)
Using cached http://mirrors.aliyun.com/pypi/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl (37 kB)
Collecting numpy>=1.20.0 (from llama-cpp-python)
Using cached http://mirrors.aliyun.com/pypi/packages/7d/4b/a509d346fffede6120cc17610cc500819417ee9c3da7f08d9aaf15cab2a3/numpy-2.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (16.3 MB)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
Using cached http://mirrors.aliyun.com/pypi/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl (45 kB)
Collecting jinja2>=2.11.3 (from llama-cpp-python)
Using cached http://mirrors.aliyun.com/pypi/packages/31/80/3a54838c3fb461f6fec263ebf3a3a41771bd05190238de3486aae8540c36/jinja2-3.1.4-py3-none-any.whl (133 kB)
Collecting MarkupSafe>=2.0 (from jinja2>=2.11.3->llama-cpp-python)
Using cached http://mirrors.aliyun.com/pypi/packages/7c/52/2b1b570f6b8b803cef5ac28fdf78c0da318916c7d2fe9402a84d591b394c/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Building wheels for collected packages: llama-cpp-python
Running command Building wheel for llama-cpp-python (pyproject.toml)
*** scikit-build-core 0.10.6 using CMake 3.30.3 (wheel)
*** Configuring CMake...
loading initial cache file /tmp/tmpcfbusg5j/build/CMakeInit.txt
-- The C compiler identification is GNU 11.4.0
CMake Error at /tmp/pip-build-env-qy09681h/normal/lib/python3.10/site-packages/cmake/data/share/cmake-3.30/Modules/CMakeDetermineCXXCompiler.cmake:48 (message):
Could not find compiler set in environment variable CXX:
Call Stack (most recent call first):
CMakeLists.txt:3 (project)
CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
-- Configuring incomplete, errors occurred!
*** CMake configuration failed
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /home/user/anaconda3/envs/inference/bin/python /home/user/anaconda3/envs/inference/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmp77od368h
cwd: /tmp/pip-install-wiqu4fsn/llama-cpp-python_f88e9e6ee6de4f55a14df58cf7e595ae
Building wheel for llama-cpp-python (pyproject.toml) ... error
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)
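The key line in the log is `Could not find compiler set in environment variable CXX`: CMake found the C compiler (GNU 11.4.0) but gave up on C++ because `CXX` is set to something it cannot execute. One plausible fix, assuming a stale `CXX` override is the cause, is to clear or correct the variables before retrying (paths are illustrative, not taken from the log):

```shell
# Clear any stale compiler overrides left in the shell environment
unset CC CXX
# Point them explicitly at whatever gcc/g++ the system actually has
export CC="$(command -v gcc)"
export CXX="$(command -v g++)"
# Confirm the compiler CMake will pick up exists before retrying
if [ -x "$CXX" ]; then
    "$CXX" --version
else
    echo "g++ not found; install it: sudo apt install build-essential"
fi
# Then retry the build from a clean cache:
# pip install --no-cache-dir llama-cpp-python --verbose
```

Exporting the variables only affects the current shell, so add them to `~/.bashrc` (or remove the stale definition that is already there) if the problem recurs in new terminals.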