fix: preserve context by retrying with full conversation on invalid previous_response_id #7714
Conversation
When the OpenAI Responses API returns a 400 error due to an invalid or expired `previous_response_id`, the code now properly re-prepares the full conversation history for the retry instead of just removing the ID and sending only the latest message. This fixes a critical bug where conversation context was completely lost when continuity failed, leading to degraded responses.

Changes:
- Modified `executeRequest()` to re-prepare input with full conversation on retry
- Modified `makeGpt5ResponsesAPIRequest()` with the same fix for SSE fallback
- Added comprehensive test coverage for both SDK and SSE retry paths
- Tests verify retry sends full conversation, not just latest message
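The retry strategy described in the commit message can be sketched roughly as follows. This is a simplified illustration, not the PR's actual code: `executeWithRetry`, `callApi`, `Message`, and the error-matching predicate are hypothetical stand-ins for the provider's real types and helpers.

```typescript
// Hypothetical sketch of the retry-with-full-conversation strategy.
type Message = { role: string; content: string }

interface RequestBody {
	input: Message[]
	previous_response_id?: string
}

// Heuristic stand-in: a 400 whose message mentions the previous response.
function isPreviousResponseError(err: { status?: number; message?: string }): boolean {
	return err.status === 400 && /previous_response/i.test(err.message ?? "")
}

async function executeWithRetry(
	callApi: (body: RequestBody) => Promise<string>,
	fullConversation: Message[],
	lastResponseId?: string,
): Promise<string> {
	// First attempt: rely on server-side continuity and send only the latest message.
	const body: RequestBody = lastResponseId
		? { input: fullConversation.slice(-1), previous_response_id: lastResponseId }
		: { input: fullConversation }
	try {
		return await callApi(body)
	} catch (err: any) {
		if (body.previous_response_id && isPreviousResponseError(err)) {
			// Continuity failed: drop the stale ID and resend the FULL
			// conversation, not just the latest message, so no context is lost.
			return await callApi({ input: fullConversation })
		}
		throw err
	}
}
```

The key point the fix makes is in the catch branch: the retry body's `input` is rebuilt from the whole conversation rather than reusing the single-message `input` of the failed request.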
```typescript
// Clear the stored lastResponseId to prevent using it again
this.lastResponseId = undefined

// Re-prepare the full conversation without previous_response_id
```
Good fix: when a 400 error occurs due to an invalid previous_response_id, the code now clears it and re-prepares the full conversation. Consider extracting the duplicated logic for re-preparing the conversation (lines ~327–339 and ~517–521) into a helper method to reduce repetition and ease future maintenance.
Thank you for this important fix! The solution correctly addresses the context loss issue when previous_response_id becomes invalid. I've reviewed the changes and have some suggestions to improve code maintainability.
```typescript
// If we have the original messages, re-prepare the full conversation
if (systemPrompt && messages) {
	const { formattedInput } = this.prepareStructuredInput(systemPrompt, messages, undefined)
	retryRequestBody.input = formattedInput
```
I agree with the existing suggestion - this duplicated logic for re-preparing the conversation (here and around line 517-521) should be extracted into a helper method. Consider something like:
```typescript
private prepareRetryRequestBody(
	requestBody: any,
	systemPrompt?: string,
	messages?: Anthropic.Messages.MessageParam[],
): any {
	const retryRequestBody = { ...requestBody };
	delete retryRequestBody.previous_response_id;
	if (systemPrompt && messages) {
		const { formattedInput } = this.prepareStructuredInput(systemPrompt, messages, undefined);
		retryRequestBody.input = formattedInput;
	}
	return retryRequestBody;
}
```
```typescript
if (is400Error && requestBody.previous_response_id && isPreviousResponseError) {
	// Log the error and retry without the previous_response_id
```
The comment mentions logging but no actual logging is implemented. Consider adding proper error logging here for debugging purposes:
```typescript
console.warn('Previous response ID not found, retrying with full conversation:', errorMessage);
```
```typescript
if (response.status === 400 && requestBody.previous_response_id && isPreviousResponseError) {
	// Log the error and retry without the previous_response_id
```
Same logging issue here - consider adding actual error logging:
```typescript
console.warn('Previous response ID not found (SSE path), retrying with full conversation:', errorDetails);
```
```typescript
	role: "user",
	content: [{ type: "input_text", text: "What number did I ask you to remember?" }],
},
])
```
Great test coverage! Consider adding one more test case for when both the initial request AND the retry fail, to ensure proper error propagation in that scenario.
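The suggested extra case could look roughly like this. It is a self-contained sketch, not the repo's actual test harness: `retryOnInvalidPreviousResponse` is an illustrative stand-in for the provider's retry path, and the point shown is only that the retry's error surfaces to the caller when both attempts fail.

```typescript
// Hypothetical sketch of the suggested test: when both the initial request
// AND the retry fail, the second error should propagate unchanged.
async function retryOnInvalidPreviousResponse(
	callApi: (usePreviousId: boolean) => Promise<string>,
): Promise<string> {
	try {
		return await callApi(true) // first attempt uses previous_response_id
	} catch (err: any) {
		if (err?.status === 400) {
			return await callApi(false) // retry with full conversation, no ID
		}
		throw err
	}
}

async function testBothAttemptsFail(): Promise<void> {
	let attempts = 0
	const alwaysFail = async (_usePreviousId: boolean): Promise<string> => {
		attempts++
		// First attempt: invalid previous_response_id. Retry: server error.
		throw { status: attempts === 1 ? 400 : 500, message: `attempt ${attempts} failed` }
	}
	let caught: any
	try {
		await retryOnInvalidPreviousResponse(alwaysFail)
	} catch (err) {
		caught = err
	}
	// Exactly two attempts were made, and the retry's error reached the caller.
	if (attempts !== 2) throw new Error("expected two attempts")
	if (caught?.status !== 500) throw new Error("retry error should propagate")
}

testBothAttemptsFail()
```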
Description
This PR fixes a critical regression in the OpenAI Responses API where conversation context was lost after a continuity failure.
The Problem

When a `previous_response_id` becomes invalid or expired (400 error), the current code on main simply strips the ID and retries with only the latest message, so all prior conversation context is lost.

The Solution

This PR ensures that when `previous_response_id` fails, the retry removes `previous_response_id` from the request and re-prepares the full conversation history as the input.

Changes

- Modified `executeRequest()` to re-prepare input with full conversation on retry
- Modified `makeGpt5ResponsesAPIRequest()` with the same fix for SSE fallback

Test Procedure
Run the tests.

Expected: all 36 tests pass, including the new tests that verify:

- The first request with `previous_response_id` sends only the latest message
- The retry after a 400 error sends the full conversation

Impact
Important

Fixes context loss by retrying with the full conversation history on invalid `previous_response_id` errors in the OpenAI Responses API.

- Retries with the full conversation when `previous_response_id` is invalid in `executeRequest()` and `makeGpt5ResponsesAPIRequest()`.
- On a 400 error, clears `previous_response_id` and resends the full conversation.
- Adds tests to `openai-native.spec.ts` to verify the retry logic sends the full conversation on a 400 error.
- Tests verify the first request sends only the latest message with `previous_response_id`, and the second sends the full history.

This description was created by
for 8d4a77f. You can customize this summary. It will automatically update as commits are pushed.