I'm running

```
python3 -m garak --model_type ggml --model_name /code/llm_models/llama-bot-chat.Q5_K_M.gguf --probes encoding
```

and I have set `GGML_MAIN_PATH=/code/llama.cpp/main` in my `.zshrc` and sourced it. When I run the command above I get:

```
subprocess.CalledProcessError: Command '['/code/llama.cpp/main', '-m', '/code/llm_models/llama-bot-chat.Q5_K_M.gguf', '-n', '150', '--repeat-penalty', '1.1', '--presence-penalty', '0.0', '--frequency-penalty', '0.0', '--top-k', '40', '--top-p', '0.95', '--temp', '0.8', '-s', 'None', '-p', "1,h(!0JY@l@V0(-1cRL,2)Ji#2_d9P3G(3\\@Q@sR2E!@'@l?>)ARdK-AMR[L@l-&!3+66)Ai;M$An3T-"]' returned non-zero exit status 1.
```

I'm unsure what I have done wrong. I tested the command on both Linux and Mac and got the same result both times.
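For reference, the command garak spawns can be re-run by hand to see llama.cpp's own error output, since garak only reports the non-zero exit status. This is just a reproduction sketch copied from the `CalledProcessError` above, with the long encoded prompt replaced by a placeholder string:

```sh
# Re-run the subprocess garak spawned, copied from the traceback above.
# The encoded probe prompt is replaced with a short placeholder here.
/code/llama.cpp/main \
  -m /code/llm_models/llama-bot-chat.Q5_K_M.gguf \
  -n 150 \
  --repeat-penalty 1.1 --presence-penalty 0.0 --frequency-penalty 0.0 \
  --top-k 40 --top-p 0.95 --temp 0.8 \
  -s None \
  -p "test prompt"
```

Whatever llama.cpp prints to stderr before exiting should show why the call fails.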
leondz changed the title from "GGML generator" to "bug: GGML generator non-zero exit code" on Feb 12, 2024