feat(cli): message queueing #7632
Conversation
EDIT: this is now resolved. There is a current problem with the implementation, and any help would be appreciated. Currently, when a message is dequeued and added to the chat history, the chat history/GUI does not show the sent message. Here is a video showing the problem: problem.mp4
Force-pushed from 0504a36 to 4c7c295
This will be a great improvement to cn!
🎉 This PR is included in version 1.13.0 🎉 The release is available on:
Your semantic-release bot 📦🚀
Description
Messages sent while a response is still streaming are queued internally and processed by the LLM once the current response completes.
resolves CON-3768
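As a rough illustration of the behavior described above, here is a minimal TypeScript sketch, not the PR's actual code; `MessageQueue`, `SendFn`, `submit`, and `onStreamEnd` are hypothetical names. Messages submitted mid-stream are buffered and flushed in order once streaming ends.

```typescript
// Minimal sketch of the queueing behavior (hypothetical names, not the
// PR's actual implementation).
type SendFn = (message: string) => Promise<void>;

class MessageQueue {
  private pending: string[] = [];
  private streaming = false;

  constructor(private send: SendFn) {}

  // Called whenever the user submits a message.
  async submit(message: string): Promise<void> {
    if (this.streaming) {
      // A response is still streaming: buffer the message instead.
      this.pending.push(message);
      return;
    }
    this.streaming = true;
    await this.send(message);
  }

  // Called when the current streamed response finishes.
  async onStreamEnd(): Promise<void> {
    this.streaming = false;
    const next = this.pending.shift();
    if (next !== undefined) {
      // Automatically send the oldest queued message.
      await this.submit(next);
    }
  }
}
```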
AI Code Review
@continue-general-review
or @continue-detailed-review
Checklist
Screen recording or screenshot
[ When applicable, please include a short screen recording or screenshot - this makes it much easier for us as contributors to review and understand your changes. See this PR as a good example. ]
Tests
[ What tests were added or updated to ensure the changes work as expected? ]
Summary by cubic
Adds message queueing to the CLI chat: pressing Enter during streaming now queues your message and sends it automatically when the model finishes, so you can keep typing without waiting. Addresses Linear CON-3768.
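To make the Enter-key flow concrete, here is a self-contained, hypothetical sketch; all names (`pendingQueue`, `isStreaming`, `startRequest`, `onEnter`) are illustrative and the real CLI's internals may differ.

```typescript
// Hypothetical sketch of the Enter-key flow described above; all names
// are illustrative, not the PR's actual code.
const pendingQueue: string[] = [];
let isStreaming = false;

function startRequest(message: string): void {
  isStreaming = true;
  console.log(`-> ${message}`);
  // Simulate the model finishing its streamed response after a delay.
  setTimeout(() => {
    isStreaming = false;
    const next = pendingQueue.shift();
    if (next !== undefined) startRequest(next); // auto-send queued message
  }, 100);
}

// Pressing Enter while a response streams queues the message; otherwise
// it is sent immediately.
function onEnter(message: string): void {
  if (isStreaming) pendingQueue.push(message);
  else startRequest(message);
}

onEnter("first");  // sent immediately; streaming begins
onEnter("second"); // Enter pressed mid-stream: queued, sent after "first"
```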