
python3Packages.llama-cpp-python: disable pythonImportsCheck when cudaSupport is enabled #465751

Merged
ConnorBaker merged 1 commit into NixOS:master from GaetanLepage:llama-cpp-python on Dec 7, 2025

Conversation

@GaetanLepage (Contributor) commented Nov 27, 2025

Things done

llama_cpp fails to import in the sandbox when cudaSupport is enabled.
Indeed, libllama.so is dlopen-ed at import time and, when cudaSupport is enabled, it in turn loads libcuda.so, which is expected to be provided by the host driver (in /run/opengl-driver/lib).
This cannot work inside the sandbox.
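The failure mode can be reproduced in isolation with plain ctypes. This is a minimal sketch of the mechanism, not the package's actual loader code (which lives in llama_cpp/_ctypes_extensions.py):

```python
import ctypes

# llama-cpp-python dlopens libllama.so at import time. Inside the Nix build
# sandbox, its transitive dependency libcuda.so.1 (normally supplied by the
# host driver under /run/opengl-driver/lib) does not exist, so the dlopen
# fails with an OSError before any Python-level code can run.
err_msg = None
try:
    ctypes.CDLL("libllama.so")
except OSError as e:
    err_msg = str(e)

# On a machine without the library this prints the dlopen error, e.g.
# "libllama.so: cannot open shared object file: No such file or directory".
print("dlopen failed:", err_msg)
```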

I also added autoAddDriverRunpath so that the driver runpath is effectively added to libllama.so's rpath.
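A sketch of what the change plausibly looks like in the derivation (illustrative, not the exact diff; `cudaSupport`, `autoAddDriverRunpath`, and `pythonImportsCheck` are standard nixpkgs attributes):

```nix
{
  nativeBuildInputs = lib.optionals cudaSupport [
    # Adds /run/opengl-driver/lib to the runpath of the built libraries so
    # that libcuda.so.1 resolves at run time on NixOS.
    autoAddDriverRunpath
  ];

  # libcuda.so.1 is provided by the host driver and does not exist in the
  # build sandbox, so importing llama_cpp there can never succeed.
  pythonImportsCheck = lib.optionals (!cudaSupport) [ "llama_cpp" ];
}
```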

https://hydra.nixos-cuda.org/build/26753/log/tail

cc @SomeoneSerge

  • Built on platform:
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • Tested, as applicable:
  • Ran nixpkgs-review on this PR. See nixpkgs-review usage.
  • Tested basic functionality of all binary files, usually in ./result/bin/.
  • Nixpkgs Release Notes
    • Package update: when the change is major or breaking.
  • NixOS Release Notes
    • Module addition: when adding a new NixOS module.
    • Module update: when the change is significant.
  • Fits CONTRIBUTING.md, pkgs/README.md, maintainers/README.md and other READMEs.

Add a 👍 reaction to pull requests you find important.

@nixpkgs-ci nixpkgs-ci bot requested review from booxter and kirillrdy November 27, 2025 22:18
@nixpkgs-ci nixpkgs-ci bot added 10.rebuild-linux: 11-100 This PR causes between 11 and 100 packages to rebuild on Linux. 10.rebuild-darwin: 1-10 This PR causes between 1 and 10 packages to rebuild on Darwin. 6.topic: python Python is a high-level, general-purpose programming language. labels Nov 27, 2025
@GaetanLepage (Contributor, Author) commented:

Well... Unfortunately, pythonImportsCheck of all packages that depend on and import llama-cpp-python will fail too... Not ideal :/

@GaetanLepage (Contributor, Author) commented:

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 465751 --extra-nixpkgs-config '{ allowUnfree = true; cudaSupport = true; }'
Commit: bc4e59413e03dd64cdd4193ed2ffbff6d232dd1d


x86_64-linux

❌ 20 packages failed to build:
  • python312Packages.kserve
  • python312Packages.kserve.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python312Packages.outlines
  • python312Packages.outlines.dist
  • python312Packages.torchrl
  • python312Packages.torchrl.dist
  • vllm (python312Packages.vllm)
  • vllm.dist (python312Packages.vllm.dist)
  • python313Packages.kserve
  • python313Packages.kserve.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist
  • python313Packages.outlines
  • python313Packages.outlines.dist
  • python313Packages.torchrl
  • python313Packages.torchrl.dist
  • python313Packages.vllm
  • python313Packages.vllm.dist
✅ 5 packages built:
  • nixpkgs-manual
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist

aarch64-linux

❌ 4 packages failed to build:
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist
✅ 5 packages built:
  • nixpkgs-manual
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist

Error logs: `x86_64-linux`
python312Packages.llm-gguf
  File "<string>", line 1, in <lambda>
  File "/nix/store/fdibxyh7xcmqrc172y78awzhxs292gq1-python3-3.12.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 999, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/nix/store/p1i27w3wjjhdlf8f4icgnn8fy40i6w8s-python3.12-llm-gguf-0.2/lib/python3.12/site-packages/llm_gguf.py", line 4, in <module>
    from llama_cpp import Llama
  File "/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/llama_cpp.py", line 38, in <module>
    _lib = load_shared_library(_lib_base_name, _base_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/_ctypes_extensions.py", line 69, in load_shared_library
    raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library '/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
python312Packages.outlines
/nix/store/fdibxyh7xcmqrc172y78awzhxs292gq1-python3-3.12.12/lib/python3.12/ctypes/__init__.py:379: in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
E   OSError: libcuda.so.1: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:
tests/models/test_llamacpp_type_adapter.py:4: in <module>
from llama_cpp import LogitsProcessorList
/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/__init__.py:1: in <module>
from .llama_cpp import *
/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/llama_cpp.py:38: in <module>
_lib = load_shared_library(_lib_base_name, _base_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/_ctypes_extensions.py:69: in load_shared_library
raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
E RuntimeError: Failed to load shared library '/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
=========================== short test summary info ============================
ERROR tests/models/test_llamacpp_type_adapter.py - RuntimeError: Failed to load shared library '/nix/store/1kkfw0ckh4hz7hpyxfgw0jyv6ivfixm4-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
======================= 60 deselected, 1 error in 11.29s =======================

python313Packages.llm-gguf
    import sys; import importlib; list(map(lambda mod: importlib.import_module(mod), sys.argv[1:]))
                                                       ~~~~~~~~~~~~~~~~~~~~~~~^^^^^
  File "/nix/store/3lll9y925zz9393sa59h653xik66srjb-python3-3.13.9/lib/python3.13/importlib/__init__.py", line 88, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 1027, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/nix/store/3hbwnjvjg9cgk7mpbnnm2w2l6852q41p-python3.13-llm-gguf-0.2/lib/python3.13/site-packages/llm_gguf.py", line 4, in <module>
    from llama_cpp import Llama
  File "/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/llama_cpp.py", line 38, in <module>
    _lib = load_shared_library(_lib_base_name, _base_path)
  File "/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/_ctypes_extensions.py", line 69, in load_shared_library
    raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library '/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
python313Packages.outlines
/nix/store/3lll9y925zz9393sa59h653xik66srjb-python3-3.13.9/lib/python3.13/ctypes/__init__.py:390: in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
E   OSError: libcuda.so.1: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:
tests/models/test_llamacpp_type_adapter.py:4: in <module>
from llama_cpp import LogitsProcessorList
/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/__init__.py:1: in <module>
from .llama_cpp import *
/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/llama_cpp.py:38: in <module>
_lib = load_shared_library(_lib_base_name, _base_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/_ctypes_extensions.py:69: in load_shared_library
raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
E RuntimeError: Failed to load shared library '/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
=========================== short test summary info ============================
ERROR tests/models/test_llamacpp_type_adapter.py - RuntimeError: Failed to load shared library '/nix/store/ah5qqm75z32nw41zwb40n3kyfivrnm0h-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
======================= 60 deselected, 1 error in 11.66s =======================


Error logs: `aarch64-linux`
python312Packages.llm-gguf
  File "<string>", line 1, in <lambda>
  File "/nix/store/yyns9w3mfdxn5jwqa3lbp461wq9nl5p7-python3-3.12.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 999, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/nix/store/p8k7q987w811yk9m2paqpyy4xs91wgcm-python3.12-llm-gguf-0.2/lib/python3.12/site-packages/llm_gguf.py", line 4, in <module>
    from llama_cpp import Llama
  File "/nix/store/rwkz34fwk8qqgb1cgcyl1llcjlnkbfm1-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/nix/store/rwkz34fwk8qqgb1cgcyl1llcjlnkbfm1-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/llama_cpp.py", line 38, in <module>
    _lib = load_shared_library(_lib_base_name, _base_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/rwkz34fwk8qqgb1cgcyl1llcjlnkbfm1-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/_ctypes_extensions.py", line 69, in load_shared_library
    raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library '/nix/store/rwkz34fwk8qqgb1cgcyl1llcjlnkbfm1-python3.12-llama-cpp-python-0.3.16/lib/python3.12/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory
python313Packages.llm-gguf
    import sys; import importlib; list(map(lambda mod: importlib.import_module(mod), sys.argv[1:]))
                                                       ~~~~~~~~~~~~~~~~~~~~~~~^^^^^
  File "/nix/store/zc0vzc0vp0amc9iqm3cd542bmim6vh09-python3-3.13.9/lib/python3.13/importlib/__init__.py", line 88, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 1027, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/nix/store/69j76mgh8qfvsxyzbwilpnpvw8xazfpq-python3.13-llm-gguf-0.2/lib/python3.13/site-packages/llm_gguf.py", line 4, in <module>
    from llama_cpp import Llama
  File "/nix/store/p0rk7bp5pc0n30hqlx1d8qn0qkpck7s1-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/nix/store/p0rk7bp5pc0n30hqlx1d8qn0qkpck7s1-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/llama_cpp.py", line 38, in <module>
    _lib = load_shared_library(_lib_base_name, _base_path)
  File "/nix/store/p0rk7bp5pc0n30hqlx1d8qn0qkpck7s1-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/_ctypes_extensions.py", line 69, in load_shared_library
    raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library '/nix/store/p0rk7bp5pc0n30hqlx1d8qn0qkpck7s1-python3.13-llama-cpp-python-0.3.16/lib/python3.13/site-packages/llama_cpp/lib/libllama.so': libcuda.so.1: cannot open shared object file: No such file or directory

@GaetanLepage (Contributor, Author) commented:

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 465751
Commit: bc4e59413e03dd64cdd4193ed2ffbff6d232dd1d


x86_64-linux

✅ 24 packages built:
  • python312Packages.kserve
  • python312Packages.kserve.dist
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python312Packages.outlines
  • python312Packages.outlines.dist
  • python312Packages.torchrl
  • python312Packages.torchrl.dist
  • vllm (python312Packages.vllm)
  • vllm.dist (python312Packages.vllm.dist)
  • python313Packages.kserve
  • python313Packages.kserve.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist
  • python313Packages.outlines
  • python313Packages.outlines.dist
  • python313Packages.torchrl
  • python313Packages.torchrl.dist
  • python313Packages.vllm
  • python313Packages.vllm.dist

aarch64-linux

✅ 12 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python312Packages.outlines
  • python312Packages.outlines.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist
  • python313Packages.outlines
  • python313Packages.outlines.dist

x86_64-darwin

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

aarch64-darwin

✅ 12 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python312Packages.outlines
  • python312Packages.outlines.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist
  • python313Packages.outlines
  • python313Packages.outlines.dist

@GaetanLepage GaetanLepage marked this pull request as draft November 27, 2025 23:17
@SomeoneSerge (Contributor) commented:

Well... Unfortunately, pythonImportsCheck of all packages that depend on and import llama-cpp-python will fail too.

If you really care, one approach could be to propagate a hook that, at the least, displays a meaningful error message when pythonImportsCheck fails or, at most, disables the check with a warning.

@GaetanLepage GaetanLepage marked this pull request as ready for review December 5, 2025 13:07
@ConnorBaker ConnorBaker self-assigned this Dec 7, 2025
@ConnorBaker ConnorBaker added the 6.topic: cuda Parallel computing platform and API label Dec 7, 2025
@ConnorBaker ConnorBaker added this pull request to the merge queue Dec 7, 2025
Merged via the queue into NixOS:master with commit 0bb20d9 Dec 7, 2025
34 checks passed
@github-project-automation github-project-automation bot moved this from New to ✅ Done in CUDA Team Dec 7, 2025
@GaetanLepage GaetanLepage deleted the llama-cpp-python branch December 7, 2025 10:06