
Conversation

@Jovonni
Contributor

@Jovonni Jovonni commented Aug 23, 2025

I'm not entirely sure if this is the right way to fix this issue, but here's what I ran into when using the goose desktop app installed from the site (v1.6.0):

[Screenshot 2025-08-22 at 10 50 09 PM]

Implements streaming capability for Ollama chat completions (a rough sketch of the general shape follows the list) by:

  • adding a stream method to OllamaProvider
  • using the OpenAI-compatible streaming format with stream=true
  • handling streamed responses line by line with proper error handling
  • including usage metrics in streamed responses
  • maintaining compatibility with existing chat/auto modes
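
For context, here is a rough sketch of what OpenAI-compatible streaming against a local Ollama server tends to look like. This is not the code in this PR; the crate choices (reqwest with the "stream" feature, futures, tokio, serde_json), the endpoint path, the model name, and the stream_options field are assumptions for illustration only.

// Sketch only: stream an OpenAI-compatible chat completion from a local
// Ollama server and print content deltas as they arrive.
use futures::StreamExt;
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::Client::new();
    let body = json!({
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "hello"}],
        "stream": true,                               // ask the server for incremental chunks
        "stream_options": {"include_usage": true}     // assumption: final chunk carries usage metrics
    });

    let resp = client
        .post("http://localhost:11434/v1/chat/completions") // assumption: Ollama's OpenAI-compatible endpoint
        .json(&body)
        .send()
        .await?;

    let mut stream = resp.bytes_stream();
    let mut buf = String::new();

    while let Some(chunk) = stream.next().await {
        buf.push_str(&String::from_utf8_lossy(&chunk?));
        // SSE frames arrive as newline-delimited "data: {...}" lines.
        while let Some(pos) = buf.find('\n') {
            let line = buf[..pos].trim().to_string();
            buf.drain(..=pos);
            let Some(payload) = line.strip_prefix("data: ") else { continue };
            if payload == "[DONE]" {
                return Ok(());
            }
            let value: Value = serde_json::from_str(payload)?;
            if let Some(delta) = value["choices"][0]["delta"]["content"].as_str() {
                print!("{delta}");
            }
            if let Some(usage) = value.get("usage").filter(|u| !u.is_null()) {
                eprintln!("\nusage: {usage}");
            }
        }
    }
    Ok(())
}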


@got-root

Looks like this may fix this one: #4308

Collaborator

@DOsinga DOsinga left a comment


thanks!

let config = crate::config::Config::global();
let goose_mode = config.get_param("GOOSE_MODE").unwrap_or("auto".to_string());
let filtered_tools = if goose_mode == "chat" { &[] } else { tools };

Collaborator


this seems unexpected. by the time we get here, we should already have filtered the tools. do we really need this?

Collaborator


yeah I don't think so, not sure where that came from

@michaelneale
Collaborator

michaelneale commented Aug 27, 2025

thanks @Jovonni - that works once I patched it up: #4364 - if you want, you can pull that change into this branch and I can approve/merge, or we can shift over to there; it has your commits.

can you also sign the DCO: https://github.com/block/goose/pull/4303/checks?check_run_id=48963447829 so we can get past that

you can git cherry-pick b70930d86fd8b0c141fbf4bfcbd08c4e6de5b73a and git cherry-pick ba63988c6a3a1844e359dd58119271d8dbe4a78a (or something like that) and push here - should be good to go once you sign the DCO.

Nice first contrib - ollama feels a lot nicer now that it can stream.

@Jovonni
Contributor Author

Jovonni commented Aug 27, 2025

just signed the DCO @michaelneale

e9e8a46

sorry for the delay, have been heads down on my side.

@Jovonni
Contributor Author

Jovonni commented Aug 27, 2025

@michaelneale I also just cherry-picked your work (from #4364) as you mentioned above. Glad to hear this was useful!! 🚀 🫡

yield (message, usage);
}
}))
}
Collaborator


left this comment in the other PR too - this looks very similar to the openai implementation. can we extract this and reuse?

Collaborator


@DOsinga yeah good idea - I am not familiar with the pattern for reusing parts of another provider. can ollama just be the openai provider with some customisation? what is another exemplar to follow? (this is an older provider so may have some jank/special cases)

@michaelneale
Collaborator

@Jovonni if you get a chance, consider the comment above - if there is a way we can re-use the openai streaming, that would be nice. I may get a chance later today or tomorrow to look, but if not, we can probably merge this soon
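
On the reuse question, one possible shape (purely a hypothetical sketch; the function and type names below are invented and do not match the goose codebase) would be to pull the OpenAI-compatible chunk parsing into a shared helper that both the OpenAI and Ollama providers call:

use serde_json::Value;

/// One parsed streaming event: an optional content delta plus optional usage metrics.
pub struct StreamEvent {
    pub delta: Option<String>,
    pub usage: Option<Value>,
}

/// Parse a single SSE "data: ..." line from an OpenAI-compatible stream.
/// Returns Ok(None) for keep-alives, non-data lines, and the [DONE] sentinel.
pub fn parse_openai_compat_line(line: &str) -> Result<Option<StreamEvent>, serde_json::Error> {
    let Some(payload) = line.trim().strip_prefix("data: ") else {
        return Ok(None);
    };
    if payload == "[DONE]" {
        return Ok(None);
    }
    let value: Value = serde_json::from_str(payload)?;
    Ok(Some(StreamEvent {
        delta: value["choices"][0]["delta"]["content"]
            .as_str()
            .map(|s| s.to_string()),
        usage: value.get("usage").filter(|u| !u.is_null()).cloned(),
    }))
}

Each provider's stream method would then loop over response lines and call this helper, so the only per-provider differences left are the base URL, auth headers, and request-body defaults.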

@michaelneale michaelneale merged commit 3fda5c2 into block:main Aug 28, 2025
10 checks passed
michaelneale added a commit that referenced this pull request Aug 29, 2025
* main:
  new recipe to lint-check my code (#4416)
  removing a leftover syntax error (#4415)
  Iand/updating recipe validation workflow (#4413)
  Iand/updating recipe validation workflow (#4410)
  Fix (Ollama provider): Unsupported operation: streaming not implemented (#4303)
  change databricks default to claude sonnet 4 (#4405)
  Iand/updating recipe validation workflow (#4406)
  Add metrics for recipe metadata in scheduler, UI, and CLI (#4399)
  Iand/updating recipe validation workflow (#4403)
  making small updates to recipe validation workflow (#4401)
  Automate OpenRouter API Key Distribution for External Recipe Contributors (#3198)
  Enhance `convert_path_with_tilde_expansion` to handle Windows (#4390)
  make sure all cookbook recipes have a title and version, but no id (#4395)
  Nest TODO State in session data (#4361)
  Fast model falls back to regular (#4375)
  Update windows instructions (#4333)
lifeizhou-ap added a commit that referenced this pull request Aug 29, 2025
* main: (40 commits)
  new recipe to lint-check my code (#4416)
  removing a leftover syntax error (#4415)
  Iand/updating recipe validation workflow (#4413)
  Iand/updating recipe validation workflow (#4410)
  Fix (Ollama provider): Unsupported operation: streaming not implemented (#4303)
  change databricks default to claude sonnet 4 (#4405)
  Iand/updating recipe validation workflow (#4406)
  Add metrics for recipe metadata in scheduler, UI, and CLI (#4399)
  Iand/updating recipe validation workflow (#4403)
  making small updates to recipe validation workflow (#4401)
  Automate OpenRouter API Key Distribution for External Recipe Contributors (#3198)
  Enhance `convert_path_with_tilde_expansion` to handle Windows (#4390)
  make sure all cookbook recipes have a title and version, but no id (#4395)
  Nest TODO State in session data (#4361)
  Fast model falls back to regular (#4375)
  Update windows instructions (#4333)
  feat: linux computer control for android (termux) (#3890)
  feat: Added scroll state support for chat-session-list navigation (#4360)
  docs: typo fix (#4376)
  blog: goose janitor (#4131)
  ...
wpfleger96 added a commit to wpfleger96/goose that referenced this pull request Aug 29, 2025
…tensions-on-resume

* upstream/main: (60 commits)
  [cookbook recipe] Update Wording  (block#4438)
  feat: show enabled extensions at top of extensions page (block#4423)
  test recipe (block#4436)
  Extensions loading indicator on desktop launch (block#4412)
  removing trailing slash (block#4433)
  [recipe cookbook] test recipe (block#4431)
  [recipe cookbook] switching to SHA (block#4429)
  [recipe cookbook] Update url build (block#4427)
  [Recipe Cookbook] test recipe flow (block#4426)
  [Recipe cookbook] Addressing GitHub api format issue (block#4424)
  feat: integrate tool call icons with status indicators and daisy chaining (block#4279)
  new recipe to lint-check my code (block#4416)
  removing a leftover syntax error (block#4415)
  Iand/updating recipe validation workflow (block#4413)
  Iand/updating recipe validation workflow (block#4410)
  Fix (Ollama provider): Unsupported operation: streaming not implemented (block#4303)
  change databricks default to claude sonnet 4 (block#4405)
  Iand/updating recipe validation workflow (block#4406)
  Add metrics for recipe metadata in scheduler, UI, and CLI (block#4399)
  Iand/updating recipe validation workflow (block#4403)
  ...
thisispete pushed a commit that referenced this pull request Aug 30, 2025
This was referenced Sep 9, 2025

Development

Successfully merging this pull request may close these issues.

Feature Request: Implement Streaming for Ollama Models

4 participants