
Conversation

@blackgirlbytes blackgirlbytes commented Dec 15, 2025

Summary

Fixes #6117

When using chat mode with Ollama models that don't support tool calling (e.g., deepseek-coder:6.7b), users were getting a 400 error:

Request failed: Bad request (400): registry.ollama.ai/library/deepseek-coder:6.7b does not support tools.

Root Cause

The stream() function in the Ollama provider was not checking GooseMode::Chat and always passed tools to the API. The complete_with_model() function already had this check, but since streaming is the default behavior, the bug affected most users.

Changes

Added to the stream() function the same GooseMode::Chat filtering logic that already exists in complete_with_model():

let config = crate::config::Config::global();
let goose_mode = config.get_goose_mode().unwrap_or(GooseMode::Auto);
let filtered_tools = if goose_mode == GooseMode::Chat {
    &[]
} else {
    tools
};
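
For context, here is a minimal standalone sketch of the same pattern. The Tool struct, GooseMode enum, and tools_for_request helper are illustrative stand-ins rather than the provider's actual types; the point is only that chat mode yields an empty tool slice before the request payload is built.

// Illustrative stand-ins; not goose's actual definitions.
struct Tool {
    name: String,
}

#[derive(PartialEq)]
enum GooseMode {
    Auto,
    Chat,
}

// In chat mode the model should never see tool definitions, so the request
// is built from an empty slice instead of the caller's tools.
fn tools_for_request<'a>(mode: GooseMode, tools: &'a [Tool]) -> &'a [Tool] {
    if mode == GooseMode::Chat {
        &[] // an empty array literal is promoted to 'static, so it satisfies 'a
    } else {
        tools
    }
}

fn main() {
    let tools = vec![Tool { name: "developer__shell".to_string() }];

    // Chat mode: the request carries no tool definitions at all.
    assert!(tools_for_request(GooseMode::Chat, &tools).is_empty());

    // Any other mode: the caller's tools pass through untouched.
    let passed = tools_for_request(GooseMode::Auto, &tools);
    println!("auto mode sends {} tool(s), e.g. {}", passed.len(), passed[0].name);
}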

Testing

  • Verified the fix compiles successfully
  • Verified clippy passes with no warnings
  • Manual testing with deepseek-coder:6.7b in chat mode should now work without errors (see the test sketch below)
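
If an automated guard is wanted later, a small unit test along these lines could pin the rule down; it reuses the illustrative tools_for_request helper from the sketch above, so it documents the intended behavior rather than exercising the provider's real streaming path.

#[cfg(test)]
mod tests {
    use super::*;

    // Hypothetical regression test against the illustrative helper above; a real
    // provider test would build an actual streaming request in chat mode instead.
    #[test]
    fn chat_mode_strips_tools_from_streaming_requests() {
        let tools = vec![Tool { name: "developer__shell".to_string() }];
        assert!(tools_for_request(GooseMode::Chat, &tools).is_empty());
        assert_eq!(tools_for_request(GooseMode::Auto, &tools).len(), 1);
    }
}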

Did I use AI?

Yes, I did. While I was looking into the problem, goose wanted to fix it, and the fix seemed plausible.

The stream() function was not checking GooseMode::Chat and always
passed tools to the API, causing 400 errors with models that don't
support tool calling (e.g., deepseek-coder:6.7b).

This aligns the streaming behavior with complete_with_model(), which
already correctly filters tools in chat mode.

Fixes #6117
Copilot AI review requested due to automatic review settings December 15, 2025 15:20

Copilot AI left a comment


Pull request overview

This PR fixes a bug where Ollama models without tool calling support (e.g., deepseek-coder:6.7b) were receiving 400 errors in chat mode. The fix applies the existing tool filtering logic from complete_with_model() to the stream() method, ensuring both code paths respect GooseMode::Chat.

  • Adds GooseMode::Chat filtering to the stream() function
  • Mirrors the existing pattern in complete_with_model() for consistency


@michaelneale michaelneale left a comment


fantastic catch @blackgirlbytes - I bet that has been hiding in there for ages

@angiejones angiejones merged commit 8b80d8b into main Dec 16, 2025
23 checks passed
@angiejones angiejones deleted the fix/ollama-streaming-chat-mode-tools branch December 16, 2025 02:10
fbalicchia pushed a commit to fbalicchia/goose that referenced this pull request Dec 16, 2025
aharvard added a commit that referenced this pull request Dec 16, 2025
…erer

* origin/main: (26 commits)
  Don't persist ephemeral extensions when resuming sessions (#5974)
  chore(deps): bump mdast-util-to-hast from 13.2.0 to 13.2.1 in /ui/desktop (#5939)
  chore(deps): bump node-forge from 1.3.1 to 1.3.2 in /documentation (#5898)
  Add Scorecard supply-chain security workflow (#5810)
  Don't show subagent tool when we're a subagent (#6125)
  Fix keyboard shortcut conflict for Focus Goose Window (#5809)
  feat(goose-cli): add feature to disable update (#5886)
  workflow: enable docs-update-recipe-ref (#6132)
  fix: filter tools in Ollama streaming when chat mode is enabled (#6118)
  feat(mcp): platform extension for "code mode" MCP tool calling (#6030)
  workflow: auto-update recipe-reference on release (#5988)
  Document recipe slash commands feature (#6075)
  docs: add GitHub Copilot device flow authentication details (#6123)
  Disallow subagents with no extensions (#5825)
  chore(deps): bump js-yaml in /documentation (#6093)
  feat: external goosed server (#5978)
  fix: Make datetime info message more explicit to prevent LLM confusion about current year (#6101)
  refactor: unify subagent and subrecipe tools into single tool (#5893)
  goose repo is too big for the issue solver workflow worker (#6099)
  fix: use system not developer role in db (#6098)
  ...
zanesq added a commit that referenced this pull request Dec 16, 2025
* 'main' of github.com:block/goose: (22 commits)
  OpenRouter & Xai streaming (#5873)
  fix: resolve mcp-hermit cleanup path expansion issue (#5953)
  feat: add goose PR reviewer workflow (#6124)
  perf: Avoid repeated MCP queries during streaming responses (#6138)
  Fix YAML serialization for recipes with special characters (#5796)
  Add more posthog analytics (privacy aware) (#6122)
  docs: add Sugar MCP server to extensions registry (#6077)
  Fix tokenState loading on new sessions (#6129)
  bump bedrock dep versions (#6090)
  Don't persist ephemeral extensions when resuming sessions (#5974)
  chore(deps): bump mdast-util-to-hast from 13.2.0 to 13.2.1 in /ui/desktop (#5939)
  chore(deps): bump node-forge from 1.3.1 to 1.3.2 in /documentation (#5898)
  Add Scorecard supply-chain security workflow (#5810)
  Don't show subagent tool when we're a subagent (#6125)
  Fix keyboard shortcut conflict for Focus Goose Window (#5809)
  feat(goose-cli): add feature to disable update (#5886)
  workflow: enable docs-update-recipe-ref (#6132)
  fix: filter tools in Ollama streaming when chat mode is enabled (#6118)
  feat(mcp): platform extension for "code mode" MCP tool calling (#6030)
  workflow: auto-update recipe-reference on release (#5988)
  ...

# Conflicts:
#	ui/desktop/src/App.tsx
#	ui/desktop/src/api/sdk.gen.ts
#	ui/desktop/src/components/ChatInput.tsx
#	ui/desktop/src/components/recipes/RecipesView.tsx

Development

Successfully merging this pull request may close these issues.

Ollama streaming ignores chat mode - still sends tools to models that don't support them

4 participants