fix: remove topP parameter from Bedrock inference config #8388
Changes from all commits
```diff
@@ -45,7 +45,6 @@ import type { SingleCompletionHandler, ApiHandlerCreateMessageMetadata } from ".
 interface BedrockInferenceConfig {
 	maxTokens: number
 	temperature?: number
-	topP?: number
 }

 // Define interface for Bedrock additional model request fields
```
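After the interface change above, `BedrockInferenceConfig` carries only `maxTokens` and an optional `temperature`. A minimal sketch of assembling such a config; `buildInferenceConfig` is a hypothetical helper (the real handler inlines this logic), shown here to illustrate the `??` fallback pattern from the diff:

```typescript
// Narrowed config shape after this PR: no topP field.
interface BedrockInferenceConfig {
	maxTokens: number
	temperature?: number
}

// Hypothetical helper (not in the codebase) mirroring the handler's
// pattern: per-model temperature, falling back to a handler-level default.
function buildInferenceConfig(
	maxTokens: number,
	modelTemperature?: number,
	defaultTemperature?: number,
): BedrockInferenceConfig {
	return {
		maxTokens,
		temperature: modelTemperature ?? defaultTemperature,
	}
}

const config = buildInferenceConfig(8192, undefined, 0.7)
console.log("topP" in config) // prints "false"
console.log(config.temperature) // prints "0.7"
```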
```diff
@@ -374,12 +373,8 @@ export class AwsBedrockHandler extends BaseProvider implements SingleCompletionH
 			maxTokens: modelConfig.maxTokens || (modelConfig.info.maxTokens as number),
 			temperature: modelConfig.temperature ?? (this.options.modelTemperature as number),
 		}

-		if (!thinkingEnabled) {
-			inferenceConfig.topP = 0.1
-		}
-
-		// Check if 1M context is enabled for Claude Sonnet 4 / 4.5
+		// Check if 1M context is enabled for Claude Sonnet 4
 		// Use parseBaseModelId to handle cross-region inference prefixes
 		const baseModelId = this.parseBaseModelId(modelConfig.id)
 		const is1MContextEnabled =
```
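The branch removed in the hunk above injected `topP = 0.1` only when extended thinking was disabled. A stand-alone sketch of the old versus new behavior (these are hypothetical helper functions for illustration, not repo code):

```typescript
// Old behavior (removed by this PR): topP injected when thinking is off.
function oldConfig(thinkingEnabled: boolean) {
	const cfg: { maxTokens: number; temperature: number; topP?: number } = {
		maxTokens: 4096,
		temperature: 0.3,
	}
	if (!thinkingEnabled) {
		cfg.topP = 0.1
	}
	return cfg
}

// New behavior: topP is never set, regardless of the thinking flag.
function newConfig(_thinkingEnabled: boolean) {
	return { maxTokens: 4096, temperature: 0.3 }
}

console.log("topP" in oldConfig(false)) // prints "true"
console.log("topP" in newConfig(false)) // prints "false"
```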
```diff
@@ -649,7 +644,6 @@ export class AwsBedrockHandler extends BaseProvider implements SingleCompletionH
 		const inferenceConfig: BedrockInferenceConfig = {
 			maxTokens: modelConfig.maxTokens || (modelConfig.info.maxTokens as number),
 			temperature: modelConfig.temperature ?? (this.options.modelTemperature as number),
-			...(thinkingEnabled ? {} : { topP: 0.1 }), // Only set topP when thinking is NOT enabled
 		}

 		// For completePrompt, use a unique conversation ID based on the prompt
```

> [P2] Non-streaming path mirrors streaming config without topP. Consider adding/adjusting a unit test for completePrompt to assert that inferenceConfig does not include topP, to prevent regressions.
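The [P2] comment above suggests a regression test for the non-streaming path. A framework-agnostic sketch of the assertion is below; `buildCompletePromptConfig` is a placeholder for the config-building logic inside `completePrompt`, and the values are illustrative only:

```typescript
// Placeholder stand-in for the inference config built inside completePrompt.
// After this PR, topP must be absent whether or not thinking is enabled.
function buildCompletePromptConfig(thinkingEnabled: boolean): Record<string, unknown> {
	return {
		maxTokens: 4096,
		temperature: thinkingEnabled ? 1 : 0.3,
	}
}

// Regression check across both modes.
for (const thinking of [true, false]) {
	const cfg = buildCompletePromptConfig(thinking)
	if ("topP" in cfg) {
		throw new Error(`regression: topP present (thinkingEnabled=${thinking})`)
	}
}
console.log("ok: topP absent in both modes") // prints "ok: topP absent in both modes"
```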
> [P1] Contract change removes topP from BedrockInferenceConfig. Ensure related tests and any call sites that asserted/relied on topP are updated (e.g., src/api/providers/tests/bedrock-reasoning.spec.ts expectations when thinking is disabled).