feat: stream LLM responses #2677
Conversation
    }, [headers, body]);

    // TODO: not this?
    const [, forceUpdate] = useReducer((x) => x + 1, 0);
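For context, the `useReducer` counter above is a common force-re-render idiom. A minimal sketch of why plain mutation fails to trigger a render (the `Message` shape here is an illustrative stand-in, not the PR's actual type):

```typescript
// Why in-place mutation doesn't trigger a React re-render: React compares
// state with Object.is, so an unchanged reference looks like "no change".
// Message is an assumed shape for illustration only.
type Message = { content: string[] };

const messages: Message[] = [{ content: ["partial"] }];

// Mutating the existing objects leaves the outer reference identical...
const mutated = messages;
mutated[0].content.push(" chunk");
console.log(Object.is(messages, mutated)); // true: React would skip the render

// ...which is why the PR falls back to a counter that always changes:
const bump = (x: number) => x + 1; // the reducer passed to useReducer
console.log(bump(0)); // 1: a new value on every dispatch forces a render
```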
This almost certainly isn't the best way to do this, but without it I could not get new messages to trigger a re-render. I could use a React expert's help here :)
@zanesq any ideas here? I want to re-render every time we add a chunk, and so far the only thing that made it work was adding this fake state to force it.
@jamadeo I think the best way is to use an immutable approach instead, something like this: rather than mutating in place, always create a new array/object when updating state. React will then detect the change and re-render automatically.
Instead of

    const lastMessage = currentMessages[currentMessages.length - 1];
    lastMessage.content = [...lastMessage.content, ...newMessage.content];
    forceUpdate();

try

    currentMessages = [
      ...currentMessages.slice(0, -1),
      {
        ...currentMessages[currentMessages.length - 1],
        content: [
          ...currentMessages[currentMessages.length - 1].content,
          ...newMessage.content,
        ],
      },
    ];
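The suggested update can be factored into a pure helper. A sketch under the assumption that messages look like `{ content: [...] }` (the real types in the PR may differ):

```typescript
// Immutable "append chunk to last message", per the suggestion above.
// Message is an assumed shape for illustration only.
type Message = { content: string[] };

function appendToLastMessage(messages: Message[], chunk: Message): Message[] {
  const last = messages[messages.length - 1];
  return [
    ...messages.slice(0, -1), // keep earlier messages as-is
    { ...last, content: [...last.content, ...chunk.content] }, // fresh object
  ];
}

const before: Message[] = [{ content: ["Hel"] }];
const after = appendToLastMessage(before, { content: ["lo"] });
console.log(after[0].content.join("")); // "Hello"
console.log(Object.is(before, after)); // false: React sees new state and re-renders
```

Because every level that changed gets a new reference, no `forceUpdate` is needed.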
@jamadeo Looks really great, but there is a bug where it seems not to call tools correctly.
Force-pushed from c91784d to 0a9d86d (Compare)
Thanks @baxen for the review! I effectively reverted the subagent change because streaming doesn't really help anything in that context. If, in the future, we want to show partial text responses from models to subagents in a streamed way, we can think about it then, but there's no reason to fit it in now.
baxen
left a comment
Amazing!
@jamadeo Oops, I thought it would be an easier merge than that to update it, but it's looking good. Sorry, I have slightly broken it (looking at it now); feel free to yank that last merge if it is all broken and not fixed by your morning time.
* main: (23 commits)
  - docs: VS Code MCP video (#3307)
  - docs: fixed broken link (#3306)
  - Add YouTube video to Netlify MCP documentation (#3302)
  - docs: add sub-recipes topic (#3241)
  - docs: move topics to tutorials section (#3297)
  - site analytics (#3293)
  - chore(release): release version 1.0.35 (#3292)
  - docs: enhanced code editing topic (#3287)
  - fix cu (#3291)
  - feat: Add environment variables to override model context limits (#3260)
  - chore(release): release version 1.0.34 (#3285)
  - fix(devcontainer): install protoc to fix build (#3267)
  - Enabling npx command to install on Windows Desktop (#3283)
  - Fix: Allow native Cmd+Up/Down cursor movement when user has typed text (#3246)
  - chore(release): release version 1.0.33 (#3284)
  - fix Windows Env Vars (#3282)
  - feat: bedrock image content support (#3266)
  - Add support in goose configure for streaming http mcp tools (#3256)
  - docs: add Alby MCP tutorial (#3217)
  - refactor(tests): make logging test in goose-cli less flaky on macos (#3273)
  - ...
* main:
  - docs: recipe parameters in desktop (#3326)
  - switch to custom runner for rust build (#3325)
  - fix the npx.cmd mapping issue (#3324)
  - Structured output in Goose CLI and Goose Desktop (#3320)
  - docs: add managing tools section and tool-router topic (#3310)
  - docs: Remove Deeplink for Filesystem MCP Server (#3314)
Have been testing this with CLI + Databricks and it seems ✅
Co-authored-by: Michael Neale <[email protected]> Signed-off-by: Adam Tarantino <[email protected]>
* main:
  - fix: convert invalid recipe variable name to raw content (#3420)
  - center goose mobile screenshot (#3418)
  - docs: model context limit overrides (#3377)
  - docs: Subagents (#3402)
  - fix: avoid pass encoded empty string to goose run --recipe (#3361)
  - ux: alphabetize extensions (#3416)
  - fix: message concatenation in server session management (#3412)
  - refactor: streamline memory directory management (#3345)
  - feat: Add AZURE_OPENAI_API_KEY as a visible config parameter (#3265)
  - feat: stream LLM responses (#2677)
  - fix checkout for non mac builds (#3408)
  - Docs: Voice dictation in Goose Desktop (#3376)
  - docs: cli theme persistence (#3398)
  - docs: goose mobile (#3403)
* 'main' of github.com:block/goose:
  - fix: Set include_usage=true for OpenAI streaming (#3441)
  - feat: `recipe list` (#2814) (#2815)
  - docs: update github mcp config (#3433)
  - feat: Implement streaming for OpenAI (#3413)
  - fix: improve extension startup error messages with command details (#2694)
  - [feat]: improve file search tools to add globsearch / grep tools (#3368)
  - docs: typo in guide description (#3429)
  - fix: use safe_truncate to truncate charactor (#3263) (#3264)
  - fix: convert invalid recipe variable name to raw content (#3420)
  - center goose mobile screenshot (#3418)
  - docs: model context limit overrides (#3377)
  - docs: Subagents (#3402)
  - fix: avoid pass encoded empty string to goose run --recipe (#3361)
  - ux: alphabetize extensions (#3416)
  - fix: message concatenation in server session management (#3412)
  - refactor: streamline memory directory management (#3345)
  - feat: Add AZURE_OPENAI_API_KEY as a visible config parameter (#3265)
  - feat: stream LLM responses (#2677)

  # Conflicts:
  #   crates/goose/src/session/storage.rs
  #   ui/desktop/src/components/ChatView.tsx
  #   ui/desktop/src/components/settings/extensions/subcomponents/ExtensionList.tsx
Co-authored-by: Michael Neale <[email protected]> Signed-off-by: Soroosh <[email protected]>
Co-authored-by: Michael Neale <[email protected]> Signed-off-by: Kyle Santiago <[email protected]>
Co-authored-by: Michael Neale <[email protected]>
Only the Databricks provider to start, but it should not be hard to do this for other providers that support streaming.
Desktop:
streaming-gui.mov
CLI:
cli.mov
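To illustrate the consumer side of streamed responses, a hedged sketch of accumulating text deltas into a growing message (the names and the async-generator shape are assumptions for illustration, not Goose's actual API):

```typescript
// Sketch: consume a stream of text deltas, updating the UI on each chunk.
// fakeStream is a stand-in for a provider's streaming response.
async function* fakeStream(): AsyncGenerator<string> {
  for (const delta of ["Hel", "lo, ", "world"]) yield delta;
}

async function collect(
  stream: AsyncGenerator<string>,
  onUpdate: (partial: string) => void,
): Promise<string> {
  let text = "";
  for await (const delta of stream) {
    text += delta; // grow the message chunk by chunk
    onUpdate(text); // e.g. push new immutable state so the UI re-renders
  }
  return text;
}

collect(fakeStream(), (partial) => console.log(partial)).then((full) => {
  console.log("final:", full); // final: Hello, world
});
```

Each `onUpdate` call is where the UI would receive a fresh, immutable snapshot of the partial message.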