mac-m1 yi bug #36
Last login: Mon Jan 22 22:13:11 on ttys011
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
Cannot access gated repo for url https://hf-mirror.com/api/models/meta-llama/Llama-2-13b-hf/revision/main. Consider using
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
(base) zhangyixin@zhangyixin llama % cd ../llama.cpp
yi-34b-chat.Q8_0.gguf: 12%|█████████████████████ | 4.36G/36.5G [31:21<5:52:39, 1.52MB/s]
yi-34b-chat.Q8_0.gguf: 12%|█████████████████████ | 4.37G/36.5G [31:30<6:21:15, 1.41MB/s]
(base) zhangyixin@zhangyixin llama.cpp % chmod +x /Users/zhangyixin/Desktop/llama.cpp/TheBloke/Yi-34B-Chat-GGUF/yi-34b-chat.Q8_0.gguf
Log start
llm_load_tensors: ggml ctx size = 0.41 MiB
ggml_backend_metal_buffer_from_ptr: error: failed to allocate buffer, size = 0.00 MiB
llama_model_load: error loading model: failed to allocate buffer
llama_load_model_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model '/Users/zhangyixin/Desktop/llama.cpp/TheBloke/Yi-34B-Chat-GGUF/yi-34b-chat.Q8_0.gguf'
main: error: unable to load model
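For context on why the Metal buffer allocation likely fails: the downloaded Q8_0 GGUF is 36.5 GB (per the download log above), which exceeds the unified memory of a typical M1 Mac before any KV cache or compute buffers are allocated. A back-of-the-envelope sketch — only the 36.5 GB file size comes from the log; the overhead and RAM figures are assumptions:

```python
# Rough memory estimate for loading yi-34b-chat.Q8_0.gguf on an M1 Mac.
# Assumptions (not from the log): ~2 GB of runtime overhead for KV cache
# and compute buffers, and a 16 GB unified-memory machine.
file_size_gb = 36.5          # yi-34b-chat.Q8_0.gguf, from the download log
runtime_overhead_gb = 2.0    # assumed KV cache / compute buffers
unified_memory_gb = 16.0     # assumed: base M1 ships with 8 or 16 GB

needed_gb = file_size_gb + runtime_overhead_gb
print(f"needed ≈ {needed_gb} GB, available {unified_memory_gb} GB")
print("fits:", needed_gb <= unified_memory_gb)
```

If this arithmetic holds, a smaller quantization of the same model (or offloading fewer layers to Metal) would be the usual workaround, rather than anything specific to the Yi weights.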
1021 huggingface-cli download --token YOUR_TOKEN --resume-download --local-dir-use-symlinks False TheBloke/Yi-34B-Chat-GGUF --include "yi-34b-chat.Q8_0.gguf" --local-dir TheBloke/Yi-34B-Chat-GGUF
1022 chmod +x /Users/zhangyixin/Desktop/llama.cpp/TheBloke/Yi-34B-Chat-GGUF/yi-34b-chat.Q8_0.gguf
1023 /Users/zhangyixin/Desktop/llama.cpp/TheBloke/Yi-34B-Chat-GGUF/yi-34b-chat.Q8_0.gguf
1024 ./main --frequency-penalty 0.5 --top-k 5 --top-p 0.9 -m /Users/zhangyixin/Desktop/llama.cpp/TheBloke/Yi-34B-Chat-GGUF/yi-34b-chat.Q8_0.gguf -p "Building a website can be done in 10 simple steps:\nStep 1:" -n 400 -e
1025 hist