Ollama model num_ctx is being ignored in the newest version #7159

@v1lev

Description

App Version

v3.25.16

API Provider

Ollama

Model Used

Qwen3-Coder-30B-A3B

Roo Code Task Links (Optional)

I created a model with num_ctx set to 16384. After the newest Roo Code update, this setting is ignored and the model starts with:

llama_context: n_ctx         = 1048576
llama_context: n_ctx_per_seq = 1048576

A model with such a large context does not fit into the memory of my PC, so Roo Code simply stopped working properly.

Rolling back to 3.25.15 helped.

🔁 Steps to Reproduce

  1. OS: Windows 11 Pro 24H2; Roo Code v3.25.16; Ollama 0.11.4
  2. Create a model from Qwen3-Coder-30B-A3B with num_ctx set to 16384
  3. Try to use Roo Code
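Step 2 can be reproduced with an Ollama Modelfile. This is a sketch: the base model tag `qwen3-coder:30b-a3b` and the custom model name are assumptions and may differ from the reporter's actual local tags.

```shell
# Write a Modelfile that derives a custom model with a fixed 16K context window.
# The FROM tag is an assumption; substitute whatever tag `ollama list` shows locally.
cat > Modelfile <<'EOF'
FROM qwen3-coder:30b-a3b
PARAMETER num_ctx 16384
EOF

# Build the custom model from the Modelfile.
ollama create qwen3-coder-16k -f Modelfile

# Inspect the model; the recorded parameters should include "num_ctx 16384".
ollama show qwen3-coder-16k
```

If the client respects the model's own parameters, the llama.cpp log should then report `n_ctx = 16384` instead of the full 1048576 context reported in this issue.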

💥 Outcome Summary

Expected the model to start with num_ctx of 16384, but it started with 1048576 instead.

📄 Relevant Logs or Errors (Optional)

Metadata

    Labels

    Issue - In Progress: Someone is actively working on this. Should link to a PR soon.
    bug: Something isn't working

    Projects

    Status

    Done
