Conversation


@katzdave katzdave commented Oct 22, 2025

Summary

Fixes a critical bug where message visibility metadata was not properly synchronized between client and server after compaction, causing repeated compaction failures and context overflow issues.

The Problem

When the server performs conversation compaction:

  1. Server compacts 400 messages → sends UpdateConversation event with 10 messages (old ones marked agentVisible=false)
  2. Client updates React state ✓
  3. Bug: Local currentMessages variable still has 400 messages with agentVisible=true
  4. New messages get appended to the stale 400-message array
  5. React state gets overwritten with incorrect metadata
  6. Next request sends 400+ messages marked agentVisible=true
  7. Server tries to compact again but context is too large → compaction fails

This created a persistent mismatch between client and server state where the client thought all 400 messages were visible to the agent, while the server had correctly marked most as agentVisible=false.
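
The stale-variable pattern in steps 1–7 can be sketched in a few lines of TypeScript. The types, event names, and the simulated `setMessages` setter below are illustrative assumptions, not the real goose code (the actual handlers live in useChatStream.ts and useMessageStream.ts):

```typescript
// Sketch of the buggy streaming handler (hypothetical types; React state
// is simulated by a plain variable so the behavior is easy to observe).

interface Message { id: number; agentVisible: boolean; }

type StreamEvent =
  | { type: "UpdateConversation"; conversation: Message[] }
  | { type: "Message"; message: Message };

// Stand-in for React state: records the latest value passed to the setter.
let reactState: Message[] = [];
const setMessages = (next: Message[]) => { reactState = next; };

function handleStreamBuggy(initial: Message[], events: StreamEvent[]): Message[] {
  let currentMessages = initial; // local accumulator for the stream
  for (const event of events) {
    if (event.type === "UpdateConversation") {
      setMessages(event.conversation); // React state now holds the compacted list
      // BUG: currentMessages is not reassigned, so it still points at the
      // pre-compaction array with agentVisible=true on every message.
    } else {
      currentMessages = [...currentMessages, event.message];
      setMessages(currentMessages); // stale array clobbers the compacted state
    }
  }
  return reactState;
}
```

Feeding this handler 400 visible messages, a compaction event carrying 10, and then one new message leaves the simulated React state with 401 messages, all still marked agentVisible=true, which reproduces steps 3–6 above.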

The Fix

Synchronize the local currentMessages variable when the UpdateConversation event is received, in both streaming handlers:

  • useChatStream.ts: Added currentMessages = event.conversation to sync local variable with compacted state
  • useMessageStream.ts: Added currentMessages = parsedEvent.conversation to sync local variable with compacted state

This ensures that subsequent messages are appended to the compacted conversation with correct visibility metadata, not the stale 400-message array.
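
As a rough sketch of the one-line fix, using the same illustrative types as the bug reproduction (again hypothetical, not the actual goose code), the handler simply reassigns the local accumulator from the event payload:

```typescript
// Sketch of the fixed streaming handler (hypothetical types; React state
// is simulated by a plain variable).

interface Message { id: number; agentVisible: boolean; }

type StreamEvent =
  | { type: "UpdateConversation"; conversation: Message[] }
  | { type: "Message"; message: Message };

// Stand-in for React state: records the latest value passed to the setter.
let reactState: Message[] = [];
const setMessages = (next: Message[]) => { reactState = next; };

function handleStreamFixed(initial: Message[], events: StreamEvent[]): Message[] {
  let currentMessages = initial; // local accumulator for the stream
  for (const event of events) {
    if (event.type === "UpdateConversation") {
      // FIX: keep the local accumulator in sync with the compacted
      // conversation, mirroring the change in useChatStream.ts and
      // useMessageStream.ts.
      currentMessages = event.conversation;
      setMessages(currentMessages);
    } else {
      currentMessages = [...currentMessages, event.message];
      setMessages(currentMessages); // appends onto the compacted list
    }
  }
  return reactState;
}
```

With the reassignment in place, the same event sequence (400 stale messages, a compaction event carrying 10, then one new message) leaves state with 11 messages, and only the ones the server marked visible keep agentVisible=true.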

Testing

  • Type checks pass
  • ESLint passes
  • Prettier formatting applied

Impact

  • Before: Repeated compaction failures, context overflow errors
  • After: Proper state synchronization, compaction works reliably

@katzdave katzdave marked this pull request as draft October 22, 2025 05:01
@katzdave

OK my agent was a little eager, needs some more testing.

@katzdave katzdave closed this Oct 22, 2025
@katzdave katzdave reopened this Oct 22, 2025
@katzdave katzdave marked this pull request as ready for review October 22, 2025 16:38

@DOsinga DOsinga left a comment


Unfortunate but true!

@katzdave katzdave merged commit 68811bf into main Oct 22, 2025
29 of 30 checks passed
@katzdave katzdave deleted the fix/compaction-visibility-sync branch October 22, 2025 17:09
katzdave added a commit that referenced this pull request Oct 22, 2025
* 'main' of github.com:block/goose:
  Fix gemini again (#5308)
  fix: synchronize local message state after conversation compaction (#5315)
  docs: replace broken links with working links (#5266)
  Add Web Accessibility Auditor recipe to cookbook (#5318)
  To do mcp tutorial (#5317)
  workflows: add a manual trigger option to pr-smoke-test (#5302)
  documenting `goose recipe list` command (#5278)
wpfleger96 added a commit that referenced this pull request Oct 22, 2025
* main:
  Replace compaction notifications with system notifications (#5218)
  Diagnostics (#5323)
  Fix gemini again (#5308)
  fix: synchronize local message state after conversation compaction (#5315)
  docs: replace broken links with working links (#5266)
michaelneale added a commit that referenced this pull request Oct 24, 2025
* main: (77 commits)
  Fix legacy import (#5343)
  Unify loading goose messages and usechatstream determines chat state (#5306)
  Docs: goose session export and goose session import (#5267)
  Create recipe dir on save (#5337)
  docs: Update Discord link (#5335)
  [recipe workflow]: Fix `Invalid revision range` error  (#5334)
  Add tech-article-explainer recipe (#5333)
  doc: added beta banner for old blog post (#5332)
  feat: add code refactor recipe (#5320)
  Security audit recipe (#5319)
  feat: add generate commit message recipe (#5326)
  fix: remove dependency on gsap library (#5330)
  feat: dynamically load ollama models (#5309)
  fix: skip temperature for goose-gpt-5 model (#5311)
  Replace compaction notifications with system notifications (#5218)
  Diagnostics (#5323)
  Fix gemini again (#5308)
  fix: synchronize local message state after conversation compaction (#5315)
  docs: replace broken links with working links (#5266)
  Add Web Accessibility Auditor recipe to cookbook (#5318)
  ...
BlairAllan pushed a commit to BlairAllan/goose that referenced this pull request Oct 25, 2025
BlairAllan pushed a commit to BlairAllan/goose that referenced this pull request Nov 29, 2025
