
Conversation

@Rish-it Rish-it commented Sep 22, 2025

Description

Implemented a queue system that allows users to interrupt AI operations and add more context without losing the current request.
Related Issues

Closes #2866
Type of Change

Screenshots (if applicable)

Screenshot 2025-09-22 at 9 48 21 AM

Important

Introduces a message queue system for chat, allowing users to queue messages during AI streaming and process them post-streaming.

  • New Features:
    • Implemented MessageQueue class in queue.ts to handle queuing of chat messages.
    • Updated ChatManager in index.ts to include MessageQueue and manage message queuing.
    • Modified ChatInput in index.tsx to allow message queuing during AI streaming.
  • Behavior Changes:
    • sendMessage in ChatInput now queues messages if isStreaming is true.
    • Placeholder in ChatInput shows number of queued messages during streaming.
    • useChat hook processes queued messages after streaming ends.
  • UI Updates:
    • Send button in ChatInput changes icon and title during streaming to indicate queuing.
    • Input is cleared on send and restored if sending fails.
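The queue-or-send branch described above can be sketched in isolation (a minimal illustration; `ChatSketch`, its `sendMessage`, and the queue shape here are simplified stand-ins, not the actual component code):

```typescript
// Minimal sketch of the queue-or-send decision in ChatInput's sendMessage.
// All names here are simplified stand-ins for the real component code.
type ChatMode = 'ask' | 'build';

interface QueuedMessage {
    content: string;
    mode: ChatMode;
}

class ChatSketch {
    isStreaming = false;
    queue: QueuedMessage[] = [];
    sent: QueuedMessage[] = [];

    // Queue while streaming; send immediately otherwise.
    sendMessage(content: string, mode: ChatMode): 'queued' | 'sent' {
        const trimmed = content.trim();
        if (!trimmed) throw new Error('empty message');
        if (this.isStreaming) {
            this.queue.push({ content: trimmed, mode });
            return 'queued';
        }
        this.sent.push({ content: trimmed, mode });
        return 'sent';
    }
}
```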

This description was created by Ellipsis for 939cd08. You can customize this summary. It will automatically update as commits are pushed.


Summary by CodeRabbit

  • New Features

    • You can now keep typing and queue messages while the assistant is streaming.
    • Send button adapts during streaming (queue indicator, updated icon/title) and only disables when the input is empty.
    • Placeholder shows the number of queued messages during streaming.
  • Bug Fixes

    • Improved reliability so queued messages are automatically sent after streaming ends.
    • Input is cleared on send and restored if sending fails, preventing accidental text loss.
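The clear-and-restore behavior can be sketched as follows (a synchronous simplification; `submit`, `setInputValue`, and `send` are hypothetical stand-ins for the component's state setter and send path):

```typescript
// Sketch of the clear-and-restore pattern for the chat input:
// clear optimistically, restore the saved text if sending throws.
function submit(
    inputValue: string,
    setInputValue: (v: string) => void,
    send: (text: string) => void, // throws on failure
): boolean {
    const savedInput = inputValue.trim();
    setInputValue(''); // optimistic clear
    try {
        send(savedInput);
        return true;
    } catch {
        setInputValue(savedInput); // restore so the user's text is not lost
        return false;
    }
}
```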


vercel bot commented Sep 22, 2025

@Rish-it is attempting to deploy a commit to the Onlook Team on Vercel.

A member of the Team first needs to authorize it.

Comment on lines 140 to 142
const savedInput = inputValue.trim();
setInputValue('');


Consider moving the setInputValue('') call inside the try block after the message has been successfully sent or queued. The current implementation clears the input before confirming the operation succeeded, which means if an error occurs during sending or queueing, the user's input will be lost. This creates a poor user experience if the operation fails.

Suggested change

 const savedInput = inputValue.trim();
-setInputValue('');
+// setInputValue('') will be moved inside the try block after successful send

Spotted by Diamond


@Rish-it Rish-it changed the title from "queue added" to "feat: add more context to chat" on Sep 22, 2025

coderabbitai bot commented Sep 22, 2025

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Adds a message queue to the chat system and integrates it with ChatManager. Updates chat input to allow typing during streaming and to enqueue messages when streaming. Adjusts use-chat hook to process queued messages after streaming and simplifies context cleanup. Introduces a new queue module with enqueue/dequeue and processing.

Changes

Cohort / File(s) Summary of Changes
Chat Input behavior
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
Enables input during streaming; sends immediately when not streaming; enqueues when streaming; clears input on submit with error-based restoration; dynamic placeholder showing queue count; send button title/icon reflect queue vs send; disable only when input empty.
Chat hook flow
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
Makes context cleanup synchronous; after 500ms delay triggers editorEngine.chat.queue.processNext(); expands effect dependencies; retains suggestions/commit flow.
Chat system core
apps/web/client/src/components/store/editor/chat/index.ts
Adds queue: MessageQueue to ChatManager; initializes in constructor; clears in clear(); imports queue module.
Message Queue module
apps/web/client/src/components/store/editor/chat/queue.ts
New MessageQueue with MobX state: enqueue/dequeue/remove/clear; guarded processNext/processAll calling editorEngine.chat.sendMessage; exposes QueuedMessage interface and getters for length/processing/emptiness.
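A stripped-down sketch of the queue described above, with the MobX wiring (`makeAutoObservable`) and the `EditorEngine` dependency omitted:

```typescript
// Simplified sketch of the MessageQueue module. In the real store this
// class is MobX-observable and calls editorEngine.chat.sendMessage; both
// are left out here to keep the sketch self-contained.
interface QueuedMessage {
    id: string;
    content: string;
    type: string;
    timestamp: number;
}

class MessageQueue {
    private messages: QueuedMessage[] = [];
    private _isProcessing = false;

    get length(): number { return this.messages.length; }
    get isEmpty(): boolean { return this.messages.length === 0; }
    get isProcessing(): boolean { return this._isProcessing; }

    enqueue(content: string, type: string): QueuedMessage {
        const message: QueuedMessage = {
            id: `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`,
            content: content.trim(),
            type,
            timestamp: Date.now(),
        };
        this.messages.push(message);
        return message;
    }

    dequeue(): QueuedMessage | undefined {
        return this.messages.shift();
    }

    remove(id: string): void {
        this.messages = this.messages.filter((m) => m.id !== id);
    }

    clear(): void {
        this.messages = [];
    }
}
```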

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor User
  participant ChatInput
  participant UseChat
  participant ChatManager as ChatManager
  participant MessageQueue as MessageQueue
  participant Engine as EditorEngine

  User->>ChatInput: Type + Press Send
  alt Streaming active
    ChatInput->>ChatManager: chat.queue.enqueue(content, mode)
    ChatInput-->>User: Clear input, show queued count
  else Not streaming
    ChatInput->>ChatManager: chat.sendMessage(content, mode)
  end

  Note over UseChat,MessageQueue: After streaming completes
  UseChat->>MessageQueue: processNext() (delayed)
  loop Until queue empty or streaming resumes
    MessageQueue->>ChatManager: sendMessage(next.content, next.type)
    ChatManager->>Engine: Dispatch to AI
    Engine-->>ChatManager: Response streaming
  end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested reviewers

  • Kitenite

Poem

I tap my paws—the stream still flows,
I queue my thoughts in tidy rows.
A whisker twitch, a message hops—
When currents pause, the next one pops.
In burrows deep, the backlog sings;
Send, nibble, send—on springy springs. 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning — Docstring coverage is 0.00%, below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve coverage.

✅ Passed checks (4 passed)

  • Title Check ✅ Passed — The title "feat: add more context to chat" is concise, uses the conventional "feat:" prefix, and accurately reflects the primary change: enabling users to add context mid-chat via a queuing mechanism. It aligns with the changes to MessageQueue, ChatInput, and useChat and is clear for teammates scanning history.
  • Linked Issues Check ✅ Passed — The changes implement the core objectives of issue #2866: ChatInput enqueues messages while isStreaming so users can insert additional context during an ongoing AI response, the new MessageQueue (exposed via ChatManager) enables the interruption/queuing behavior, and useChat schedules processing of queued messages after the current response. These map directly to the issue's goals of "insert more into the conversation" and "interrupt the AI with more context" [#2866].
  • Out of Scope Changes Check ✅ Passed — All modified files are within the chat subsystem (ChatInput, the use-chat hook, the ChatManager store, and the new MessageQueue), and the UI updates to button/placeholder behavior relate directly to the queuing feature. The minor change making cleanupContext synchronous appears tied to queue-processing timing rather than an unrelated feature.
  • Description Check ✅ Passed — The PR description contains a clear Description, links the related issue (Closes #2866), and provides a detailed behavioral summary plus a screenshot, but it does not fill the template's "Testing" section, mark the "Type of Change" checkboxes, or populate "Additional Notes"; it is informative but missing those required template items.


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 6

🧹 Nitpick comments (6)
apps/web/client/src/components/store/editor/chat/queue.ts (1)

36-46: Guard against enqueueing empty/whitespace messages.

Today the call site filters empty input, but this API should be safe on its own.

-    enqueue(content: string, type: ChatType): QueuedMessage {
+    enqueue(content: string, type: ChatType): QueuedMessage {
+        const trimmed = content.trim();
+        if (!trimmed) {
+            throw new Error('Cannot enqueue empty message');
+        }
         const message: QueuedMessage = {
-            content: content.trim(),
+            content: trimmed,
             type,
-            id: `queued_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
+            id: `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`,
             timestamp: Date.now(),
         };
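Extracted into a standalone helper, the guarded enqueue behaves like this (sketch only; the class context and the ChatType parameter are omitted):

```typescript
// Standalone sketch of the guarded enqueue suggested above:
// empty/whitespace input is rejected rather than silently queued.
interface QueuedMessage {
    id: string;
    content: string;
    timestamp: number;
}

function enqueueGuarded(queue: QueuedMessage[], content: string): QueuedMessage {
    const trimmed = content.trim();
    if (!trimmed) {
        throw new Error('Cannot enqueue empty message');
    }
    const message: QueuedMessage = {
        id: `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`,
        content: trimmed,
        timestamp: Date.now(),
    };
    queue.push(message);
    return message;
}
```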
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (2)

219-223: Process the queue immediately; avoid arbitrary 500ms delay.

The queue already guards on isStreaming. Fire now and optionally clean up if this effect re-runs.

-            // Process any queued messages
-            setTimeout(() => {
-                void editorEngine.chat.queue.processNext();
-            }, 500);
+            // Process any queued messages
+            void editorEngine.chat.queue.processNext();

Optionally add a cleanup if you keep a timer.


224-225: Trim effect dependencies.

Including stable objects (queue/context/versions) adds noise without benefit here. Depend on finishReason and ids.

-    }, [finishReason, conversationId, editorEngine.chat.queue, editorEngine.chat.context, editorEngine.versions, setMessages]);
+    }, [finishReason, conversationId, setMessages]);
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (3)

96-98: Hard-disable flag no longer needed as a variable.

Since it’s always false, inline it and remove the prop plumbing to reduce noise.

-    const disabled = false; // Allow input while streaming
+    // Input stays enabled while streaming

And pass disabled={false} or drop the prop where it defaults to enabled.


143-151: Don’t log user message content.

Console logging queued messages may expose sensitive text in shared environments.

-                const queuedMessage = editorEngine.chat.queue.enqueue(savedInput, chatMode);
-                console.log('Message queued:', queuedMessage);
+                editorEngine.chat.queue.enqueue(savedInput, chatMode);

Also set the input to '' after success:

             } else {
                 // Send immediately if not streaming
                 await onSendMessage(savedInput, chatMode);
             }
+            setInputValue('');

436-441: Remove unreachable isStreaming branches in the “Send” button.

This block only renders when !isStreaming.

-                            disabled={inputEmpty}
+                            disabled={inputEmpty}
                             onClick={() => void sendMessage()}
-                            title={isStreaming ? 'Queue message' : 'Send message'}
+                            title="Send message"
                         >
-                            {isStreaming ? <Icons.CounterClockwiseClock /> : <Icons.ArrowRight />}
+                            <Icons.ArrowRight />

Consider localizing the title as well.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b752c2b and 7315b9a.

📒 Files selected for processing (4)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (4 hunks)
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1 hunks)
  • apps/web/client/src/components/store/editor/chat/index.ts (3 hunks)
  • apps/web/client/src/components/store/editor/chat/queue.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
apps/web/client/src/app/**/*.tsx

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/app/**/*.tsx: Default to Server Components; add 'use client' when using events, state/effects, browser APIs, or client‑only libraries
Do not use process.env in client code; import env from @/env instead

Avoid hardcoded user-facing text; use next-intl messages/hooks

Files:

  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
apps/web/client/src/**/*.{ts,tsx}

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/**/*.{ts,tsx}: Use path aliases @/* and ~/* for imports that map to apps/web/client/src/*
Avoid hardcoded user-facing text; use next-intl messages/hooks instead

Use path aliases @/* and ~/* for imports mapping to src/*

Files:

  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
  • apps/web/client/src/components/store/editor/chat/queue.ts
  • apps/web/client/src/components/store/editor/chat/index.ts
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
apps/web/client/src/**/*.tsx

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable references across renders
Keep the active MobX store in a useRef and perform async cleanup with setTimeout(() => storeRef.current?.clear(), 0) to avoid route-change races
Avoid useMemo for creating MobX store instances
Avoid putting the MobX store instance in effect dependency arrays if it causes loops; split concerns by domain

apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable identities across renders
Keep the active MobX store in a useRef and clean up asynchronously with setTimeout(() => storeRef.current?.clear(), 0)
Do not use useMemo to create MobX stores
Avoid placing MobX store instances in effect dependency arrays if it causes loops; split concerns instead
observer components must be client components; place a single client boundary at the feature entry; child observers need not repeat 'use client'

Files:

  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
**/*.{ts,tsx}

📄 CodeRabbit inference engine (AGENTS.md)

Do not use the any type unless necessary

Files:

  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
  • apps/web/client/src/components/store/editor/chat/queue.ts
  • apps/web/client/src/components/store/editor/chat/index.ts
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
apps/web/client/src/app/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Default to Server Components; add 'use client' only when using events, state/effects, browser APIs, or client-only libs

Files:

  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
{apps,packages}/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Avoid using the any type unless absolutely necessary

Files:

  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
  • apps/web/client/src/components/store/editor/chat/queue.ts
  • apps/web/client/src/components/store/editor/chat/index.ts
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
🧬 Code graph analysis (3)
apps/web/client/src/components/store/editor/chat/queue.ts (1)
apps/web/client/src/components/store/editor/engine.ts (1)
  • EditorEngine (33-138)
apps/web/client/src/components/store/editor/chat/index.ts (1)
apps/web/client/src/components/store/editor/chat/queue.ts (1)
  • MessageQueue (12-92)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (2)
apps/web/client/src/components/store/editor/chat/index.ts (1)
  • sendMessage (46-52)
packages/ui/src/components/icons/index.tsx (1)
  • Icons (138-3667)
🔇 Additional comments (4)
apps/web/client/src/components/store/editor/chat/index.ts (2)

22-23: Queue init in ChatManager constructor looks good.


54-58: Good: clearing queue with other chat state.

apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (1)

140-142: Defer clearing the input until success.

Avoid losing the text if queue/send fails. This mirrors prior feedback.

-        const savedInput = inputValue.trim();
-        setInputValue('');
+        const savedInput = inputValue.trim();

Move setInputValue('') into the success path (after enqueue/send).

apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1)

211-213: clearImagesFromContext is synchronous — no await required.
Definition at apps/web/client/src/components/store/editor/chat/context.ts (lines 311–313) is non-async (filters this.context); the call in apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (lines 210–213) is fine.

Comment on lines 154 to +165
const getPlaceholderText = () => {
    if (isStreaming && editorEngine.chat.queue.length > 0) {
        return `${editorEngine.chat.queue.length} message${editorEngine.chat.queue.length > 1 ? 's' : ''} queued - type to add more...`;
    }
    if (isStreaming) {
        return 'Type to queue message while AI responds...';
    }

🛠️ Refactor suggestion

Localize new placeholders (next-intl).

Avoid hardcoded user-facing strings; add keys with pluralization.

-        if (isStreaming && editorEngine.chat.queue.length > 0) {
-            return `${editorEngine.chat.queue.length} message${editorEngine.chat.queue.length > 1 ? 's' : ''} queued - type to add more...`;
-        }
-        if (isStreaming) {
-            return 'Type to queue message while AI responds...';
-        }
+        if (isStreaming && editorEngine.chat.queue.length > 0) {
+            return t(transKeys.editor.chat.input.placeholder.streamingQueued, {
+                count: editorEngine.chat.queue.length,
+            });
+        }
+        if (isStreaming) {
+            return t(transKeys.editor.chat.input.placeholder.streaming);
+        }

Please add the corresponding keys to your messages file.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
const getPlaceholderText = () => {
if (isStreaming && editorEngine.chat.queue.length > 0) {
return `${editorEngine.chat.queue.length} message${editorEngine.chat.queue.length > 1 ? 's' : ''} queued - type to add more...`;
}
if (isStreaming) {
return 'Type to queue message while AI responds...';
}
const getPlaceholderText = () => {
if (isStreaming && editorEngine.chat.queue.length > 0) {
return t(transKeys.editor.chat.input.placeholder.streamingQueued, {
count: editorEngine.chat.queue.length,
});
}
if (isStreaming) {
return t(transKeys.editor.chat.input.placeholder.streaming);
}
🤖 Prompt for AI Agents
In
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
around lines 159 to 165, the placeholder strings are hardcoded; replace them
with next-intl message lookups (use intl.formatMessage or <FormattedMessage>)
and add pluralization for the queued count (e.g., a plural key like
"chat.input.queued" with forms for 0/one/other). Add corresponding keys to the
project messages file(s) (including plural variants and the "typing while
responding" key) and update the component to use those keys instead of literal
text so translations can be provided.
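The plural-aware keys could look like the following in a next-intl messages file (illustrative structure using ICU plural syntax; the actual key paths must match whatever transKeys resolves to in this project):

```json
{
    "editor": {
        "chat": {
            "input": {
                "placeholder": {
                    "streamingQueued": "{count, plural, one {# message queued} other {# messages queued}} - type to add more...",
                    "streaming": "Type to queue message while AI responds..."
                }
            }
        }
    }
}
```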

import type { EditorEngine } from '../engine';
import { ChatContext } from './context';
import { ConversationManager } from './conversation';
import { MessageQueue } from './queue';

🛠️ Refactor suggestion

Use path alias for internal imports.

Align with configured @/* aliases.

-import { MessageQueue } from './queue';
+import { MessageQueue } from '@/components/store/editor/chat/queue';
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/index.ts around line 7, the
import uses a relative path "import { MessageQueue } from './queue';" but
project prefers the configured @/* path aliases; update the import to use the
alias (e.g. import { MessageQueue } from '@/components/store/editor/chat/queue')
matching your tsconfig/webpack alias and adjust the path to the correct aliased
location so linting/building uses the internal @/ alias.

@@ -0,0 +1,92 @@
import { type ChatType } from '@onlook/models';
import { makeAutoObservable } from 'mobx';
import type { EditorEngine } from '../engine';

🛠️ Refactor suggestion

Use path alias for internal imports.

Replace relative import with the configured alias per guidelines.

-import type { EditorEngine } from '../engine';
+import type { EditorEngine } from '@/components/store/editor/engine';
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around line 3,
replace the relative import "import type { EditorEngine } from '../engine';"
with the project's configured path alias (e.g. use the alias configured in
tsconfig like "@/components/store/editor/engine" or the appropriate alias for
this repo) so internal imports follow the guidelines; update the import path to
the aliased module and ensure TypeScript resolves the alias by using the exact
alias name used in the repo.

Comment on lines +16 to +18
constructor(private editorEngine: EditorEngine) {
    makeAutoObservable(this);
}

⚠️ Potential issue

Don’t make EditorEngine observable.

Avoid deep observability/memory churn by excluding heavy refs from MobX.

-    constructor(private editorEngine: EditorEngine) {
-        makeAutoObservable(this);
-    }
+    constructor(private editorEngine: EditorEngine) {
+        makeAutoObservable(this, { editorEngine: false });
+    }
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around lines 16 to
18, the constructor currently calls makeAutoObservable(this) which makes the
heavy EditorEngine reference fully observable; change the call to exclude that
property from observability by calling makeAutoObservable(this, { editorEngine:
false }) (or the equivalent makeObservable wiring) so EditorEngine remains a
plain non-observable field and avoid deep observability/memory churn.

Comment on lines +40 to +41
id: `queued_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
timestamp: Date.now(),

🛠️ Refactor suggestion

Replace deprecated substr().

Use slice() to avoid deprecation warnings.

-            id: `queued_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
+            id: `queued_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`,
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around lines 40-41,
the id generation uses the deprecated Math.random().toString(36).substr(2, 9);
replace substr with slice to avoid deprecation warnings and preserve the same
9-character output by using Math.random().toString(36).slice(2, 11). Update the
id line accordingly so it still concatenates Date.now() with a 9-char random
string but uses slice instead of substr.

Comment on lines 66 to 85
async processNext(): Promise<void> {
    if (this._isProcessing || this.isEmpty || this.editorEngine.chat.isStreaming) {
        return;
    }

    const nextMessage = this.dequeue();
    if (!nextMessage) {
        return;
    }

    this._isProcessing = true;

    try {
        await this.editorEngine.chat.sendMessage(nextMessage.content, nextMessage.type);
    } catch (error) {
        console.error('Error processing queued message:', error);
    } finally {
        this._isProcessing = false;
    }
}

⚠️ Potential issue

Tighten re-entrancy window in processNext().

Set the processing flag before dequeue to prevent parallel dequeues on rapid consecutive calls.

-    async processNext(): Promise<void> {
-        if (this._isProcessing || this.isEmpty || this.editorEngine.chat.isStreaming) {
-            return;
-        }
-
-        const nextMessage = this.dequeue();
-        if (!nextMessage) {
-            return;
-        }
-
-        this._isProcessing = true;
-        
-        try {
-            await this.editorEngine.chat.sendMessage(nextMessage.content, nextMessage.type);
-        } catch (error) {
-            console.error('Error processing queued message:', error);
-        } finally {
-            this._isProcessing = false;
-        }
-    }
+    async processNext(): Promise<void> {
+        if (this._isProcessing || this.isEmpty || this.editorEngine.chat.isStreaming) {
+            return;
+        }
+        this._isProcessing = true;
+        const nextMessage = this.dequeue();
+        if (!nextMessage) {
+            this._isProcessing = false;
+            return;
+        }
+        try {
+            await this.editorEngine.chat.sendMessage(nextMessage.content, nextMessage.type);
+        } catch (error) {
+            console.error('Error processing queued message:', error);
+        } finally {
+            this._isProcessing = false;
+        }
+    }
🤖 Prompt for AI Agents
In apps/web/client/src/components/store/editor/chat/queue.ts around lines 66 to
85, tighten the re-entrancy window by setting this._isProcessing = true before
calling dequeue so another caller cannot perform a parallel dequeue; after the
existing early-return checks, assign the flag, then call dequeue(), and if
dequeue returns null/undefined immediately clear this._isProcessing and return;
keep the try/catch/finally around sendMessage so the flag is always cleared in
finally.
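The effect of claiming the flag before dequeuing can be demonstrated with a stub (QueueSketch and its send callback are stand-ins for the real MessageQueue and editorEngine.chat.sendMessage):

```typescript
// Sketch of the tightened processNext(): the processing flag is set
// before dequeue, so a second back-to-back call cannot also dequeue
// while the first send is still in flight.
class QueueSketch {
    private messages: string[] = [];
    private _isProcessing = false;

    constructor(private send: (content: string) => Promise<void>) {}

    enqueue(content: string): void {
        this.messages.push(content);
    }

    get length(): number {
        return this.messages.length;
    }

    async processNext(): Promise<void> {
        if (this._isProcessing || this.messages.length === 0) return;
        this._isProcessing = true; // claim before dequeue
        const next = this.messages.shift();
        if (!next) {
            this._isProcessing = false;
            return;
        }
        try {
            await this.send(next);
        } finally {
            this._isProcessing = false;
        }
    }
}
```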

@Rish-it Rish-it closed this Sep 23, 2025
@Kitenite Kitenite reopened this Sep 23, 2025
setInputValue('');
} catch (error) {
console.error('Error sending message', error);
toast.error('Failed to send message. Please try again.');

The error handler no longer restores the input value on send failure. Consider whether you want to recover the user's input to prevent accidental text loss.

@Kitenite Kitenite closed this Sep 30, 2025
@Rish-it Rish-it deleted the feat--add-more-context branch October 2, 2025 04:26


Development

Successfully merging this pull request may close these issues.

[feat] Allow the user to interrupt the AI with more context

2 participants