Issues: ggml-org/llama.cpp

Eval bug: DeepSeek-R1-UD-Q2_K_XL output broken [bug-unconfirmed]
#13305 opened May 4, 2025 by joesixpaq

Misc. bug: Compilation with OpenCL on latest build [bug-unconfirmed]
#13300 opened May 4, 2025 by DarkSorrow

Misc. bug: -TS doesn't support more than ? Devices [bug-unconfirmed]
#13293 opened May 4, 2025 by justinjja

Compile bug: paths with spaces fail on Unix with Vulkan backend [bug-unconfirmed]
#13288 opened May 3, 2025 by kangalio

Misc. bug: Completions hang after CUDA error, but health endpoint reports all OK [bug-unconfirmed]
#13281 opened May 3, 2025 by lee-b

Misc. bug: llama-server webui overriding command line parameters [bug-unconfirmed]
#13277 opened May 3, 2025 by merc4derp

Feature Request: Granite 4 Support [enhancement]
#13275 opened May 2, 2025 by gabe-l-hart · 5 of 16 tasks

Feature Request: add to llama-bench device info reporting of "bf16:1", if built with VK_KHR_bfloat16 support and driver also supports it [enhancement]
#13274 opened May 2, 2025 by oscarbg · 4 tasks done

Feature Request: add per-request "reasoning" options in llama-server [enhancement]
#13272 opened May 2, 2025 by ngxson

Compile bug: nvcc fatal : Unsupported gpu architecture 'compute_120' [bug-unconfirmed]
#13271 opened May 2, 2025 by jacekpoplawski

Misc. bug: Server does not always cancel requests for disconnected connections [bug-unconfirmed]
#13262 opened May 2, 2025 by CyberShadow

Misc. bug: the output file of llama-quantize is not gguf format [bug-unconfirmed]
#13258 opened May 2, 2025 by samsosu

Eval bug: sentencepiece tokenizer generates incorrect tokens [bug-unconfirmed]
#13256 opened May 2, 2025 by taylorchu

Misc. bug: terminate called after throwing an instance of 'vk::DeviceLostError' [bug-unconfirmed]
#13248 opened May 1, 2025 by Som-anon

Feature Request: Support multimodal LLMs such as Qwen2.5-VL as embedding models [enhancement]
#13247 opened May 1, 2025 by cebtenzzre · 4 tasks done

Feature Request: s390x CI [enhancement]
#13243 opened May 1, 2025 by taronaeo · 4 tasks done

Feature Request: Allow disabling offload_op for backends by user [enhancement]
#13241 opened May 1, 2025 by hjc4869 · 4 tasks done

Eval bug: -sm row causes GGML_ASSERT fail in Llama 4 Scout [bug-unconfirmed]
#13240 opened May 1, 2025 by FullstackSensei

Feature Request: XiaomiMiMo/MiMo-7B-RL [enhancement]
#13218 opened Apr 30, 2025 by Superluckyhu · 4 tasks done