How to build llama-cpp-python with Metal support? #339
-
I followed this: https://github.com/abetlen/llama-cpp-python#development
However, when loading the model in text-generation-webui with n-gpu-layers set to 1, I get the following errors:
Any idea what is wrong and how to fix it?
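For context, a rebuild along the lines of the linked development instructions would look roughly like this (a sketch only; the CMAKE_ARGS/FORCE_CMAKE variables and the LLAMA_METAL flag are taken from llama.cpp / llama-cpp-python build options, so double-check the exact names against the current README):

```bash
# Sketch: rebuild llama-cpp-python with Metal support enabled.
# Flag names assumed from llama.cpp's CMake options; verify before use.
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 \
  pip install --upgrade --force-reinstall --no-cache-dir llama-cpp-python
```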
-
ok, found this topic: #317