Conversation

@uinstinct uinstinct commented Sep 8, 2025

Description

When messages are sent during streaming, they are queued internally and processed by the LLM after the current response completes.

resolves CON-3768

AI Code Review

  • Team members only: AI review runs automatically when PR is opened or marked ready for review
  • Team members can also trigger a review by commenting @continue-general-review or @continue-detailed-review

Checklist

  • [ ] I've read the contributing guide
  • [ ] The relevant docs, if any, have been updated or created
  • [ ] The relevant tests, if any, have been updated or created

Screen recording or screenshot

[ When applicable, please include a short screen recording or screenshot - this makes it much easier for us as contributors to review and understand your changes. See this PR as a good example. ]

Tests

[ What tests were added or updated to ensure the changes work as expected? ]


Summary by cubic

Adds message queuing to the CLI chat: pressing Enter during streaming now queues your message and sends it automatically when the model finishes. This improves flow so you can keep typing. Addresses Linear CON-3768.

  • New Features
    • Introduced a FIFO MessageQueue with timestamps and debug logs; supports optional image attachments.
    • UserInput now enqueues submissions during streaming instead of blocking Enter; input is cleared immediately.
    • useChat processes one queued message after each response (with a brief delay to let the UI update) and adds processed messages to the input history.
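The queue described in the summary can be sketched roughly as follows. This is a minimal illustration, not the actual Continue CLI code: the names `QueuedMessage` and `MessageQueue`, the field layout, and the string-path representation of image attachments are all assumptions based on the bullet points above.

```typescript
// Hypothetical sketch of the FIFO message queue from the PR summary.
// All identifiers here are illustrative, not taken from the real codebase.

interface QueuedMessage {
  text: string;
  timestamp: number;           // recorded when the user presses Enter mid-stream
  imageAttachments?: string[]; // optional image attachments, per the summary
}

class MessageQueue {
  private queue: QueuedMessage[] = [];

  // Called by the input handler when a submission arrives during streaming,
  // instead of blocking Enter; the input field is cleared immediately.
  enqueue(text: string, imageAttachments?: string[]): void {
    const message: QueuedMessage = { text, timestamp: Date.now(), imageAttachments };
    this.queue.push(message);
    console.debug(`queued message at ${message.timestamp} (${this.queue.length} pending)`);
  }

  // Called once per completed response to pull the oldest queued message.
  dequeue(): QueuedMessage | undefined {
    return this.queue.shift();
  }

  get size(): number {
    return this.queue.length;
  }
}
```

Processing one message per completed response (rather than draining the whole queue) keeps each queued submission paired with its own model response, which matches the behavior the summary describes.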

@uinstinct uinstinct requested a review from a team as a code owner September 8, 2025 16:33
@uinstinct uinstinct requested review from Patrick-Erichsen and removed request for a team September 8, 2025 16:33
@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Sep 8, 2025
@uinstinct uinstinct (Contributor, Author) commented Sep 8, 2025

EDIT: this is now resolved

There is currently a problem with the implementation; any help would be appreciated.

When a message is dequeued and added to the chat history, the chat history/GUI does not show the sent message. Here is a video showing the problem.

problem.mp4

@tingwai tingwai force-pushed the cli-message-queueing branch from 0504a36 to 4c7c295 Compare September 9, 2025 23:26
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Sep 9, 2025
@uinstinct uinstinct changed the title feat(cli): message queing feat(cli): message queueing Sep 10, 2025
@tingwai tingwai (Collaborator) left a comment


This will be a great improvement to cn!

@github-project-automation github-project-automation bot moved this from Todo to In Progress in Issues and PRs Sep 11, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Sep 11, 2025
@tingwai tingwai merged commit f0a7f7f into continuedev:main Sep 11, 2025
67 of 70 checks passed
@github-project-automation github-project-automation bot moved this from In Progress to Done in Issues and PRs Sep 11, 2025
@github-actions github-actions bot locked and limited conversation to collaborators Sep 11, 2025
@sestinj sestinj (Contributor) commented Sep 11, 2025

🎉 This PR is included in version 1.13.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

@uinstinct uinstinct deleted the cli-message-queueing branch September 12, 2025 02:45