Closed as duplicate of #7797
Labels
Issue - In Progress: Someone is actively working on this. Should link to a PR soon.
bug: Something isn't working
Description
App Version
v3.25.16
API Provider
Ollama
Model Used
Qwen3-Coder-30B-A3B
Roo Code Task Links (Optional)
I created a model with num_ctx set to 16384. Since the latest Roo Code update, that setting is ignored and the model starts with:
llama_context: n_ctx = 1048576
llama_context: n_ctx_per_seq = 1048576
A model with such a large context does not fit into my PC's memory, so Roo Code simply stopped working properly.
Rolling back to v3.25.15 resolved the issue.
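Independent of Roo Code, Ollama also accepts a per-request num_ctx override via the options field of its generate API, which can be used to confirm the server itself honors a 16k context. A minimal sketch of such a request payload (the model name is illustrative, not from this report):

```python
import json


def build_generate_request(model: str, prompt: str, num_ctx: int) -> str:
    # Ollama's /api/generate endpoint accepts an "options" object whose
    # values override Modelfile parameters for that request; num_ctx caps
    # the context window the runner allocates.
    payload = {
        "model": model,
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    }
    return json.dumps(payload)


# Build a request that pins the context window to 16384 tokens.
req = build_generate_request("qwen3-coder-16k", "hello", 16384)
print(req)
```

If a client sends options.num_ctx explicitly, that value wins over the Modelfile; the bug here appears to be Roo Code supplying its own (huge) value instead of leaving the model's configured one alone.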
🔁 Steps to Reproduce
- OS: Windows 11 Pro 24H2; Roo Code v3.25.16; Ollama 0.11.4
- Create a model from Qwen3-Coder-30B-A3B with num_ctx set to 16384
- Try to use Roo Code
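For completeness, step 2 corresponds to creating a custom model from a Modelfile along these lines (the base model tag is illustrative):

```
# Modelfile: derive a model with the context window pinned to 16k
FROM qwen3-coder:30b
PARAMETER num_ctx 16384
```

built with `ollama create qwen3-coder-16k -f Modelfile`; `ollama show qwen3-coder-16k` should then list num_ctx 16384 under its parameters, which is the value Roo Code v3.25.16 ignores.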
💥 Outcome Summary
Expected the model to start with num_ctx = 16384, but it started with 1048576 instead.
📄 Relevant Logs or Errors (Optional)
Status: Done