feat: add more context to chat #2886
Conversation
@Rish-it is attempting to deploy a commit to the Onlook Team on Vercel. A member of the Team first needs to authorize it.
```ts
const savedInput = inputValue.trim();
setInputValue('');
```
Consider moving the setInputValue('') call inside the try block after the message has been successfully sent or queued. The current implementation clears the input before confirming the operation succeeded, which means if an error occurs during sending or queueing, the user's input will be lost. This creates a poor user experience if the operation fails.
Suggested change:

```diff
 const savedInput = inputValue.trim();
-setInputValue('');
+// setInputValue('') will be moved inside the try block after successful send
```
Spotted by Diamond
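The fix Diamond suggests can be sketched as a standalone TypeScript helper (names like `sendPreservingInput` and `onSendMessage` are hypothetical stand-ins, not the PR's actual code): the input is cleared only after the send succeeds, so a failure leaves the user's text intact.

```typescript
// Hypothetical helper illustrating the suggested ordering: clear the
// input only on the success path, so failures lose nothing.
async function sendPreservingInput(
    inputValue: string,
    setInputValue: (value: string) => void,
    onSendMessage: (content: string) => Promise<void>,
): Promise<boolean> {
    const savedInput = inputValue.trim();
    if (!savedInput) return false;
    try {
        await onSendMessage(savedInput);
        setInputValue(''); // clear only after the operation succeeded
        return true;
    } catch {
        // leave the input untouched so the user can retry
        return false;
    }
}
```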
Note: CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Adds a message queue to the chat system and integrates it with ChatManager. Updates the chat input to allow typing during streaming and to enqueue messages when streaming. Adjusts the use-chat hook to process queued messages after streaming and simplifies context cleanup. Introduces a new queue module with enqueue/dequeue and processing.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor User
    participant ChatInput
    participant UseChat
    participant ChatManager as ChatManager
    participant MessageQueue as MessageQueue
    participant Engine as EditorEngine
    User->>ChatInput: Type + Press Send
    alt Streaming active
        ChatInput->>ChatManager: chat.queue.enqueue(content, mode)
        ChatInput-->>User: Clear input, show queued count
    else Not streaming
        ChatInput->>ChatManager: chat.sendMessage(content, mode)
    end
    Note over UseChat,MessageQueue: After streaming completes
    UseChat->>MessageQueue: processNext() (delayed)
    loop Until queue empty or streaming resumes
        MessageQueue->>ChatManager: sendMessage(next.content, next.type)
        ChatManager->>Engine: Dispatch to AI
        Engine-->>ChatManager: Response streaming
    end
```

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs
Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
✨ Finishing touches
🧪 Generate unit tests
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 6
🧹 Nitpick comments (6)
apps/web/client/src/components/store/editor/chat/queue.ts (1)
36-46: Guard against enqueueing empty/whitespace messages. Today the call site filters, but this API should be safe by itself.
```diff
-    enqueue(content: string, type: ChatType): QueuedMessage {
+    enqueue(content: string, type: ChatType): QueuedMessage {
+        const trimmed = content.trim();
+        if (!trimmed) {
+            throw new Error('Cannot enqueue empty message');
+        }
         const message: QueuedMessage = {
-            content: content.trim(),
+            content: trimmed,
             type,
-            id: `queued_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
+            id: `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`,
             timestamp: Date.now(),
         };
```

apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (2)
219-223: Process the queue immediately; avoid the arbitrary 500ms delay. The queue already guards on isStreaming. Fire now and optionally clean up if this effect re-runs.
```diff
-        // Process any queued messages
-        setTimeout(() => {
-            void editorEngine.chat.queue.processNext();
-        }, 500);
+        // Process any queued messages
+        void editorEngine.chat.queue.processNext();
```

Optionally add a cleanup if you keep a timer.
224-225: Trim effect dependencies. Including stable objects (queue/context/versions) adds noise without benefit here. Depend on finishReason and ids.
```diff
-    }, [finishReason, conversationId, editorEngine.chat.queue, editorEngine.chat.context, editorEngine.versions, setMessages]);
+    }, [finishReason, conversationId, setMessages]);
```

apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (3)
96-98: Hard-disable flag no longer needed as a variable. Since it's always false, inline it and remove the prop plumbing to reduce noise.
```diff
-    const disabled = false; // Allow input while streaming
+    // Input stays enabled while streaming
```

And pass disabled={false} or drop the prop where it defaults to enabled.
143-151: Don’t log user message content. Console logging queued messages may expose sensitive text in shared environments.
```diff
-            const queuedMessage = editorEngine.chat.queue.enqueue(savedInput, chatMode);
-            console.log('Message queued:', queuedMessage);
+            editorEngine.chat.queue.enqueue(savedInput, chatMode);
```

Also set the input to '' after success:
```diff
         } else {
             // Send immediately if not streaming
             await onSendMessage(savedInput, chatMode);
         }
+        setInputValue('');
```
436-441: Remove unreachable isStreaming branches in the “Send” button. This block only renders when !isStreaming.
```diff
     disabled={inputEmpty}
     onClick={() => void sendMessage()}
-    title={isStreaming ? 'Queue message' : 'Send message'}
+    title="Send message"
 >
-    {isStreaming ? <Icons.CounterClockwiseClock /> : <Icons.ArrowRight />}
+    <Icons.ArrowRight />
```

Consider localizing the title as well.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (4 hunks)
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1 hunks)
apps/web/client/src/components/store/editor/chat/index.ts (3 hunks)
apps/web/client/src/components/store/editor/chat/queue.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
apps/web/client/src/app/**/*.tsx
📄 CodeRabbit inference engine (AGENTS.md)
apps/web/client/src/app/**/*.tsx: Default to Server Components; add 'use client' when using events, state/effects, browser APIs, or client‑only libraries
Do not use process.env in client code; import env from @/env instead
Avoid hardcoded user-facing text; use next-intl messages/hooks
Files:
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
apps/web/client/src/**/*.{ts,tsx}
📄 CodeRabbit inference engine (AGENTS.md)
apps/web/client/src/**/*.{ts,tsx}: Use path aliases @/* and ~/* for imports that map to apps/web/client/src/*
Avoid hardcoded user-facing text; use next-intl messages/hooks instead
Use path aliases @/* and ~/* for imports mapping to src/*
Files:
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/components/store/editor/chat/queue.ts
apps/web/client/src/components/store/editor/chat/index.ts
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
apps/web/client/src/**/*.tsx
📄 CodeRabbit inference engine (AGENTS.md)
apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable references across renders
Keep the active MobX store in a useRef and perform async cleanup with setTimeout(() => storeRef.current?.clear(), 0) to avoid route-change races
Avoid useMemo for creating MobX store instances
Avoid putting the MobX store instance in effect dependency arrays if it causes loops; split concerns by domain
apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable identities across renders
Keep the active MobX store in a useRef and clean up asynchronously with setTimeout(() => storeRef.current?.clear(), 0)
Do not use useMemo to create MobX stores
Avoid placing MobX store instances in effect dependency arrays if it causes loops; split concerns instead
observer components must be client components; place a single client boundary at the feature entry; child observers need not repeat 'use client'
Files:
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
**/*.{ts,tsx}
📄 CodeRabbit inference engine (AGENTS.md)
Do not use the any type unless necessary
Files:
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/components/store/editor/chat/queue.ts
apps/web/client/src/components/store/editor/chat/index.ts
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
apps/web/client/src/app/**/*.{ts,tsx}
📄 CodeRabbit inference engine (CLAUDE.md)
Default to Server Components; add 'use client' only when using events, state/effects, browser APIs, or client-only libs
Files:
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
{apps,packages}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (CLAUDE.md)
Avoid using the any type unless absolutely necessary
Files:
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/components/store/editor/chat/queue.ts
apps/web/client/src/components/store/editor/chat/index.ts
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
🧬 Code graph analysis (3)
apps/web/client/src/components/store/editor/chat/queue.ts (1)
apps/web/client/src/components/store/editor/engine.ts (1)
EditorEngine(33-138)
apps/web/client/src/components/store/editor/chat/index.ts (1)
apps/web/client/src/components/store/editor/chat/queue.ts (1)
MessageQueue(12-92)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (2)
apps/web/client/src/components/store/editor/chat/index.ts (1)
sendMessage (46-52)

packages/ui/src/components/icons/index.tsx (1)
Icons(138-3667)
🔇 Additional comments (4)
apps/web/client/src/components/store/editor/chat/index.ts (2)
22-23: Queue init in ChatManager constructor looks good.
54-58: Good: clearing queue with other chat state.

apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (1)
140-142: Defer clearing the input until success. Avoid losing the text if queue/send fails. This mirrors prior feedback.
```diff
-        const savedInput = inputValue.trim();
-        setInputValue('');
+        const savedInput = inputValue.trim();
```

Move setInputValue('') into the success path (after enqueue/send).
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1)
211-213: clearImagesFromContext is synchronous — no await required.
Definition at apps/web/client/src/components/store/editor/chat/context.ts (lines 311–313) is non-async (filters this.context); the call in apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (lines 210–213) is fine.
```ts
    const getPlaceholderText = () => {
        if (isStreaming && editorEngine.chat.queue.length > 0) {
            return `${editorEngine.chat.queue.length} message${editorEngine.chat.queue.length > 1 ? 's' : ''} queued - type to add more...`;
        }
        if (isStreaming) {
            return 'Type to queue message while AI responds...';
        }
```
🛠️ Refactor suggestion
Localize new placeholders (next-intl).
Avoid hardcoded user-facing strings; add keys with pluralization.
```diff
-    if (isStreaming && editorEngine.chat.queue.length > 0) {
-        return `${editorEngine.chat.queue.length} message${editorEngine.chat.queue.length > 1 ? 's' : ''} queued - type to add more...`;
-    }
-    if (isStreaming) {
-        return 'Type to queue message while AI responds...';
-    }
+    if (isStreaming && editorEngine.chat.queue.length > 0) {
+        return t(transKeys.editor.chat.input.placeholder.streamingQueued, {
+            count: editorEngine.chat.queue.length,
+        });
+    }
+    if (isStreaming) {
+        return t(transKeys.editor.chat.input.placeholder.streaming);
+    }
```

Please add the corresponding keys to your messages file.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
    const getPlaceholderText = () => {
        if (isStreaming && editorEngine.chat.queue.length > 0) {
            return t(transKeys.editor.chat.input.placeholder.streamingQueued, {
                count: editorEngine.chat.queue.length,
            });
        }
        if (isStreaming) {
            return t(transKeys.editor.chat.input.placeholder.streaming);
        }
```
🤖 Prompt for AI Agents
In
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
around lines 159 to 165, the placeholder strings are hardcoded; replace them
with next-intl message lookups (use intl.formatMessage or <FormattedMessage>)
and add pluralization for the queued count (e.g., a plural key like
"chat.input.queued" with forms for 0/one/other). Add corresponding keys to the
project messages file(s) (including plural variants and the "typing while
responding" key) and update the component to use those keys instead of literal
text so translations can be provided.
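While the review points at next-intl, the pluralization logic itself can be sketched with the standard `Intl.PluralRules` API (the fallback placeholder string below is an assumption, not the component's actual default):

```typescript
// Sketch of pluralized placeholder text using the standard Intl.PluralRules
// API. In the actual codebase this would go through next-intl message keys.
const plural = new Intl.PluralRules('en');

function queuedPlaceholder(queueLength: number, isStreaming: boolean): string {
    if (isStreaming && queueLength > 0) {
        const noun = plural.select(queueLength) === 'one' ? 'message' : 'messages';
        return `${queueLength} ${noun} queued - type to add more...`;
    }
    if (isStreaming) {
        return 'Type to queue message while AI responds...';
    }
    return 'Ask anything...'; // hypothetical default, not from the PR
}
```

With next-intl, the `plural.select` branch would instead be expressed as ICU plural forms inside the message file.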
```ts
import type { EditorEngine } from '../engine';
import { ChatContext } from './context';
import { ConversationManager } from './conversation';
import { MessageQueue } from './queue';
```
🛠️ Refactor suggestion
Use path alias for internal imports.
Align with configured @/* aliases.
```diff
-import { MessageQueue } from './queue';
+import { MessageQueue } from '@/components/store/editor/chat/queue';
```
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/index.ts around line 7, the
import uses a relative path "import { MessageQueue } from './queue';" but
project prefers the configured @/* path aliases; update the import to use the
alias (e.g. import { MessageQueue } from '@/components/store/editor/chat/queue')
matching your tsconfig/webpack alias and adjust the path to the correct aliased
location so linting/building uses the internal @/ alias.
```diff
@@ -0,0 +1,92 @@
+import { type ChatType } from '@onlook/models';
+import { makeAutoObservable } from 'mobx';
+import type { EditorEngine } from '../engine';
```
🛠️ Refactor suggestion
Use path alias for internal imports.
Replace relative import with the configured alias per guidelines.
```diff
-import type { EditorEngine } from '../engine';
+import type { EditorEngine } from '@/components/store/editor/engine';
```
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around line 3,
replace the relative import "import type { EditorEngine } from '../engine';"
with the project's configured path alias (e.g. use the alias configured in
tsconfig like "@/components/store/editor/engine" or the appropriate alias for
this repo) so internal imports follow the guidelines; update the import path to
the aliased module and ensure TypeScript resolves the alias by using the exact
alias name used in the repo.
```ts
    constructor(private editorEngine: EditorEngine) {
        makeAutoObservable(this);
    }
```
Don’t make EditorEngine observable.
Avoid deep observability/memory churn by excluding heavy refs from MobX.
```diff
-    constructor(private editorEngine: EditorEngine) {
-        makeAutoObservable(this);
-    }
+    constructor(private editorEngine: EditorEngine) {
+        makeAutoObservable(this, { editorEngine: false });
+    }
```
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around lines 16 to
18, the constructor currently calls makeAutoObservable(this) which makes the
heavy EditorEngine reference fully observable; change the call to exclude that
property from observability by calling makeAutoObservable(this, { editorEngine:
false }) (or the equivalent makeObservable wiring) so EditorEngine remains a
plain non-observable field and avoid deep observability/memory churn.
```ts
            id: `queued_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
            timestamp: Date.now(),
```
🛠️ Refactor suggestion
Replace deprecated substr().
Use slice() to avoid deprecation warnings.
```diff
-            id: `queued_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
+            id: `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`,
```
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around lines 40-41,
the id generation uses the deprecated Math.random().toString(36).substr(2, 9);
replace substr with slice to avoid deprecation warnings and preserve the same
9-character output by using Math.random().toString(36).slice(2, 11). Update the
id line accordingly so it still concatenates Date.now() with a 9-char random
string but uses slice instead of substr.
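A quick standalone check that the `slice(2, 11)`-based id has the intended shape (the helper name is hypothetical):

```typescript
// slice(2, 11) takes up to 9 characters of the base-36 fraction,
// matching the old substr(2, 9) output without the deprecated API.
function makeQueuedId(): string {
    return `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
}
```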
```ts
    async processNext(): Promise<void> {
        if (this._isProcessing || this.isEmpty || this.editorEngine.chat.isStreaming) {
            return;
        }

        const nextMessage = this.dequeue();
        if (!nextMessage) {
            return;
        }

        this._isProcessing = true;

        try {
            await this.editorEngine.chat.sendMessage(nextMessage.content, nextMessage.type);
        } catch (error) {
            console.error('Error processing queued message:', error);
        } finally {
            this._isProcessing = false;
        }
    }
```
Tighten re-entrancy window in processNext().
Set the processing flag before dequeue to prevent parallel dequeues on rapid consecutive calls.
```diff
-    async processNext(): Promise<void> {
-        if (this._isProcessing || this.isEmpty || this.editorEngine.chat.isStreaming) {
-            return;
-        }
-
-        const nextMessage = this.dequeue();
-        if (!nextMessage) {
-            return;
-        }
-
-        this._isProcessing = true;
-
-        try {
-            await this.editorEngine.chat.sendMessage(nextMessage.content, nextMessage.type);
-        } catch (error) {
-            console.error('Error processing queued message:', error);
-        } finally {
-            this._isProcessing = false;
-        }
-    }
+    async processNext(): Promise<void> {
+        if (this._isProcessing || this.isEmpty || this.editorEngine.chat.isStreaming) {
+            return;
+        }
+        this._isProcessing = true;
+        const nextMessage = this.dequeue();
+        if (!nextMessage) {
+            this._isProcessing = false;
+            return;
+        }
+        try {
+            await this.editorEngine.chat.sendMessage(nextMessage.content, nextMessage.type);
+        } catch (error) {
+            console.error('Error processing queued message:', error);
+        } finally {
+            this._isProcessing = false;
+        }
+    }
```
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around lines 66 to
85, tighten the re-entrancy window by setting this._isProcessing = true before
calling dequeue so another caller cannot perform a parallel dequeue; after the
existing early-return checks, assign the flag, then call dequeue(), and if
dequeue returns null/undefined immediately clear this._isProcessing and return;
keep the try/catch/finally around sendMessage so the flag is always cleared in
finally.
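The tightened guard can be demonstrated with a small standalone class (plain TypeScript, not the PR's code): because the flag is claimed before `dequeue`, two overlapping `processNext` calls send at most one message.

```typescript
// Standalone demo of the re-entrancy guard: the processing flag is set
// *before* dequeue, so overlapping calls cannot both pull a message.
class GuardedQueue {
    private items: string[] = [];
    private isProcessing = false;
    sent: string[] = [];

    push(item: string): void {
        this.items.push(item);
    }

    async processNext(): Promise<void> {
        if (this.isProcessing) return;
        this.isProcessing = true; // claim before dequeue
        const next = this.items.shift();
        if (next === undefined) {
            this.isProcessing = false;
            return;
        }
        try {
            // Simulate an async send that yields to the event loop.
            await Promise.resolve();
            this.sent.push(next);
        } finally {
            this.isProcessing = false;
        }
    }
}
```

Calling `processNext` twice concurrently on a one-item queue sends the item exactly once; the second call bails at the `isProcessing` check.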
```ts
            setInputValue('');
        } catch (error) {
            console.error('Error sending message', error);
            toast.error('Failed to send message. Please try again.');
```
The error handler no longer restores the input value on send failure. Consider whether you want to recover the user's input to prevent accidental text loss.
Description
Implemented a queue system that allows users to interrupt AI operations and add more context without losing the current request.
Related Issues
Closes #2866
Type of Change
Screenshots (if applicable)
Important
Introduces a message queue system for chat, allowing users to queue messages during AI streaming and process them post-streaming.
- MessageQueue class in queue.ts to handle queuing of chat messages.
- ChatManager in index.ts to include MessageQueue and manage message queuing.
- ChatInput in index.tsx to allow message queuing during AI streaming.
- sendMessage in ChatInput now queues messages if isStreaming is true.
- ChatInput shows number of queued messages during streaming.
- useChat hook processes queued messages after streaming ends.
- ChatInput changes icon and title during streaming to indicate queuing.

This description was created by
for 939cd08. You can customize this summary. It will automatically update as commits are pushed.
Summary by CodeRabbit
New Features
Bug Fixes