feat: drops unsupported openai params #2195
sammaji wants to merge 1 commit into `03-13-feat_litellmcompat_chat_to_responses`
Conversation
No actionable comments were generated in the recent review. 🎉
📒 Files selected for processing (5); ✅ 3 skipped from review (trivial changes); 🚧 2 skipped from review (similar to previous changes)
📝 Walkthrough — Summary by CodeRabbit
Walkthrough: Adds detection of provider-unsupported request parameters per model, records them in the context, and strips those top-level JSON fields from serialized provider requests. ModelCatalog now indexes supported parameters for lookup by the plugin.
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant Plugin as LiteLLMCompat<br/>Plugin
    participant Catalog as ModelCatalog
    participant Serializer as Request<br/>Serializer
    participant Provider
    Client->>Plugin: Send BifrostRequest (includes model)
    activate Plugin
    Plugin->>Catalog: GetSupportedParameters(model)
    activate Catalog
    Catalog-->>Plugin: supportedParams
    deactivate Catalog
    Plugin->>Plugin: computeUnsupportedParams(req, supportedParams)
    Plugin->>Plugin: ctx.SetValue(BifrostContextKeyLiteLLMCompatDroppedParams, droppedParams)
    deactivate Plugin
    Plugin->>Serializer: Forward request + context
    activate Serializer
    Serializer->>Serializer: CheckContextAndGetRequestBody(ctx)
    Serializer->>Serializer: dropUnsupportedParams(ctx, jsonBody)
    Serializer->>Provider: Send filtered JSON request
    deactivate Serializer
    Provider-->>Client: Response
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (warning)
Warning: This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite.
This stack of pull requests is managed by Graphite.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
plugins/litellmcompat/main.go (1)
Lines 80-101: ⚠️ Potential issue | 🟠 Major: Reset dropped-params context state at hook entry.
Line 97 only writes BifrostContextKeyLiteLLMCompatDroppedParams when there are drops. If PreLLMHook runs again in the same context lifecycle, the old list can persist, and line 1071 in core/providers/utils/utils.go may delete the wrong fields.
💡 Proposed fix

```diff
 func (p *LiteLLMCompatPlugin) PreLLMHook(ctx *schemas.BifrostContext, req *schemas.BifrostRequest) (*schemas.BifrostRequest, *schemas.LLMPluginShortCircuit, error) {
 	// Reset context keys
 	if ctx != nil {
 		ctx.SetValue(schemas.BifrostContextKeyShouldConvertTextToChat, false)
 		ctx.SetValue(schemas.BifrostContextKeyShouldConvertChatToResponses, false)
+		ctx.SetValue(schemas.BifrostContextKeyLiteLLMCompatDroppedParams, []string(nil))
 	}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@plugins/litellmcompat/main.go` around lines 80-101: when entering PreLLMHook, always clear any previous dropped-params state so stale values don't persist. After verifying ctx != nil (the existing reset block), set schemas.BifrostContextKeyLiteLLMCompatDroppedParams to an empty value (nil or an empty slice) before running transformTextToChatRequest/transformChatToResponsesRequest and before computeUnsupportedParams, so the key is explicitly reset on each hook invocation.
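The staleness this comment describes can be reproduced with a generic map-backed context. A hedged sketch with simplified types (`ctxMap` and the hook functions are hypothetical, not Bifrost's actual API):

```go
package main

import "fmt"

// ctxMap is a simplified stand-in for a context whose values persist
// across hook invocations within one request lifecycle.
type ctxMap map[string]any

// hookWithoutReset only writes the key when drops exist, mirroring the bug.
func hookWithoutReset(ctx ctxMap, dropped []string) {
	if len(dropped) > 0 {
		ctx["droppedParams"] = dropped
	}
}

// hookWithReset clears the key unconditionally at entry, as the fix suggests.
func hookWithReset(ctx ctxMap, dropped []string) {
	ctx["droppedParams"] = []string(nil)
	if len(dropped) > 0 {
		ctx["droppedParams"] = dropped
	}
}

func main() {
	ctx := ctxMap{}
	hookWithoutReset(ctx, []string{"temperature"}) // first run drops a param
	hookWithoutReset(ctx, nil)                     // second run drops nothing...
	fmt.Println(ctx["droppedParams"])              // ...but the stale list persists

	ctx2 := ctxMap{}
	hookWithReset(ctx2, []string{"temperature"})
	hookWithReset(ctx2, nil)
	fmt.Println(ctx2["droppedParams"]) // cleared on the second run
}
```

The second hook run without a reset would let the serializer delete fields that were never flagged for this request.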
ℹ️ Review info
📒 Files selected for processing (5):
- core/providers/utils/utils.go
- core/schemas/bifrost.go
- framework/modelcatalog/main.go
- plugins/litellmcompat/dropparams.go
- plugins/litellmcompat/main.go
Force-pushed 0e500f1 to 8590b1e (Compare)
Force-pushed 0c27d5a to d24a2b1 (Compare)
Force-pushed 8590b1e to 0d26248 (Compare)
Actionable comments posted: 3
ℹ️ Review info
📒 Files selected for processing (5):
- core/providers/utils/utils.go
- core/schemas/bifrost.go
- framework/modelcatalog/main.go
- plugins/litellmcompat/dropparams.go
- plugins/litellmcompat/main.go
🚧 Files skipped from review as they are similar to previous changes (1)
- core/schemas/bifrost.go
```go
func unsupportedChatParams(p *schemas.ChatParameters, supported []string) []string {
	var dropped []string
	if p.Audio != nil && !slices.Contains(supported, "audio") {
		dropped = append(dropped, "audio")
	}
	if p.FrequencyPenalty != nil && !slices.Contains(supported, "frequency_penalty") {
		dropped = append(dropped, "frequency_penalty")
	}
	if p.LogitBias != nil && !slices.Contains(supported, "logit_bias") {
		dropped = append(dropped, "logit_bias")
	}
	if p.LogProbs != nil && !slices.Contains(supported, "logprobs") {
		dropped = append(dropped, "logprobs")
	}
	if p.MaxCompletionTokens != nil && !slices.Contains(supported, "max_completion_tokens") {
		dropped = append(dropped, "max_completion_tokens")
	}
	if p.Metadata != nil && !slices.Contains(supported, "metadata") {
		dropped = append(dropped, "metadata")
	}
	if p.ParallelToolCalls != nil && !slices.Contains(supported, "parallel_tool_calls") {
		dropped = append(dropped, "parallel_tool_calls")
	}
	if p.Prediction != nil && !slices.Contains(supported, "prediction") {
		dropped = append(dropped, "prediction")
	}
	if p.PresencePenalty != nil && !slices.Contains(supported, "presence_penalty") {
		dropped = append(dropped, "presence_penalty")
	}
	if p.PromptCacheKey != nil && !slices.Contains(supported, "prompt_cache_key") {
		dropped = append(dropped, "prompt_cache_key")
	}
	if p.PromptCacheRetention != nil && !slices.Contains(supported, "prompt_cache_retention") {
		dropped = append(dropped, "prompt_cache_retention")
	}
	if p.Reasoning != nil && !slices.Contains(supported, "reasoning") {
		dropped = append(dropped, "reasoning")
	}
	if p.ResponseFormat != nil && !slices.Contains(supported, "response_format") {
		dropped = append(dropped, "response_format")
	}
	if p.Seed != nil && !slices.Contains(supported, "seed") {
		dropped = append(dropped, "seed")
	}
	if p.ServiceTier != nil && !slices.Contains(supported, "service_tier") {
		dropped = append(dropped, "service_tier")
	}
	if len(p.Stop) > 0 && !slices.Contains(supported, "stop") {
		dropped = append(dropped, "stop")
	}
	if p.Temperature != nil && !slices.Contains(supported, "temperature") {
		dropped = append(dropped, "temperature")
	}
	if p.TopLogProbs != nil && !slices.Contains(supported, "top_logprobs") {
		dropped = append(dropped, "top_logprobs")
	}
	if p.TopP != nil && !slices.Contains(supported, "top_p") {
		dropped = append(dropped, "top_p")
	}
	if p.ToolChoice != nil && !slices.Contains(supported, "tool_choice") {
		dropped = append(dropped, "tool_choice")
	}
	if len(p.Tools) > 0 && !slices.Contains(supported, "tools") {
		dropped = append(dropped, "tools")
	}
	if p.Verbosity != nil && !slices.Contains(supported, "verbosity") {
		dropped = append(dropped, "verbosity")
	}
	return dropped
}
```
Missing fields in unsupportedChatParams.
Several ChatParameters fields are not checked against the allowlist. Comparing with core/schemas/chatcompletions.go (lines 49-82), the following are missing:
- Modalities (slice) → "modalities"
- SafetyIdentifier (pointer) → "safety_identifier"
- Store (pointer) → "store"
- StreamOptions (pointer) → "stream_options"
- User (pointer) → "user"
- WebSearchOptions (pointer) → "web_search_options"
These fields will pass through unfiltered even when unsupported by the model.
🔧 Proposed fix to add missing fields

```diff
 	if p.Verbosity != nil && !slices.Contains(supported, "verbosity") {
 		dropped = append(dropped, "verbosity")
 	}
+	if len(p.Modalities) > 0 && !slices.Contains(supported, "modalities") {
+		dropped = append(dropped, "modalities")
+	}
+	if p.SafetyIdentifier != nil && !slices.Contains(supported, "safety_identifier") {
+		dropped = append(dropped, "safety_identifier")
+	}
+	if p.Store != nil && !slices.Contains(supported, "store") {
+		dropped = append(dropped, "store")
+	}
+	if p.StreamOptions != nil && !slices.Contains(supported, "stream_options") {
+		dropped = append(dropped, "stream_options")
+	}
+	if p.User != nil && !slices.Contains(supported, "user") {
+		dropped = append(dropped, "user")
+	}
+	if p.WebSearchOptions != nil && !slices.Contains(supported, "web_search_options") {
+		dropped = append(dropped, "web_search_options")
+	}
 	return dropped
```

📝 Committable suggestion
‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```go
func unsupportedChatParams(p *schemas.ChatParameters, supported []string) []string {
	var dropped []string
	if p.Audio != nil && !slices.Contains(supported, "audio") {
		dropped = append(dropped, "audio")
	}
	if p.FrequencyPenalty != nil && !slices.Contains(supported, "frequency_penalty") {
		dropped = append(dropped, "frequency_penalty")
	}
	if p.LogitBias != nil && !slices.Contains(supported, "logit_bias") {
		dropped = append(dropped, "logit_bias")
	}
	if p.LogProbs != nil && !slices.Contains(supported, "logprobs") {
		dropped = append(dropped, "logprobs")
	}
	if p.MaxCompletionTokens != nil && !slices.Contains(supported, "max_completion_tokens") {
		dropped = append(dropped, "max_completion_tokens")
	}
	if p.Metadata != nil && !slices.Contains(supported, "metadata") {
		dropped = append(dropped, "metadata")
	}
	if p.ParallelToolCalls != nil && !slices.Contains(supported, "parallel_tool_calls") {
		dropped = append(dropped, "parallel_tool_calls")
	}
	if p.Prediction != nil && !slices.Contains(supported, "prediction") {
		dropped = append(dropped, "prediction")
	}
	if p.PresencePenalty != nil && !slices.Contains(supported, "presence_penalty") {
		dropped = append(dropped, "presence_penalty")
	}
	if p.PromptCacheKey != nil && !slices.Contains(supported, "prompt_cache_key") {
		dropped = append(dropped, "prompt_cache_key")
	}
	if p.PromptCacheRetention != nil && !slices.Contains(supported, "prompt_cache_retention") {
		dropped = append(dropped, "prompt_cache_retention")
	}
	if p.Reasoning != nil && !slices.Contains(supported, "reasoning") {
		dropped = append(dropped, "reasoning")
	}
	if p.ResponseFormat != nil && !slices.Contains(supported, "response_format") {
		dropped = append(dropped, "response_format")
	}
	if p.Seed != nil && !slices.Contains(supported, "seed") {
		dropped = append(dropped, "seed")
	}
	if p.ServiceTier != nil && !slices.Contains(supported, "service_tier") {
		dropped = append(dropped, "service_tier")
	}
	if len(p.Stop) > 0 && !slices.Contains(supported, "stop") {
		dropped = append(dropped, "stop")
	}
	if p.Temperature != nil && !slices.Contains(supported, "temperature") {
		dropped = append(dropped, "temperature")
	}
	if p.TopLogProbs != nil && !slices.Contains(supported, "top_logprobs") {
		dropped = append(dropped, "top_logprobs")
	}
	if p.TopP != nil && !slices.Contains(supported, "top_p") {
		dropped = append(dropped, "top_p")
	}
	if p.ToolChoice != nil && !slices.Contains(supported, "tool_choice") {
		dropped = append(dropped, "tool_choice")
	}
	if len(p.Tools) > 0 && !slices.Contains(supported, "tools") {
		dropped = append(dropped, "tools")
	}
	if p.Verbosity != nil && !slices.Contains(supported, "verbosity") {
		dropped = append(dropped, "verbosity")
	}
	if len(p.Modalities) > 0 && !slices.Contains(supported, "modalities") {
		dropped = append(dropped, "modalities")
	}
	if p.SafetyIdentifier != nil && !slices.Contains(supported, "safety_identifier") {
		dropped = append(dropped, "safety_identifier")
	}
	if p.Store != nil && !slices.Contains(supported, "store") {
		dropped = append(dropped, "store")
	}
	if p.StreamOptions != nil && !slices.Contains(supported, "stream_options") {
		dropped = append(dropped, "stream_options")
	}
	if p.User != nil && !slices.Contains(supported, "user") {
		dropped = append(dropped, "user")
	}
	if p.WebSearchOptions != nil && !slices.Contains(supported, "web_search_options") {
		dropped = append(dropped, "web_search_options")
	}
	return dropped
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@plugins/litellmcompat/dropparams.go` around lines 27 - 95, The
unsupportedChatParams function is missing checks for several ChatParameters
fields so they can be dropped when unsupported; update unsupportedChatParams to
mirror existing patterns (nil checks for pointers, len>0 for slices) and append
the corresponding string keys when not present in supported: check p.Modalities
-> "modalities" (len>0), p.SafetyIdentifier -> "safety_identifier" (!= nil),
p.Store -> "store" (!= nil), p.StreamOptions -> "stream_options" (!= nil),
p.User -> "user" (!= nil), and p.WebSearchOptions -> "web_search_options" (!=
nil), following the same style as the other fields (use
slices.Contains(supported, "...") and append to dropped).
```go
func unsupportedResponsesParams(p *schemas.ResponsesParameters, supported []string) []string {
	var dropped []string
	if p.MaxOutputTokens != nil && !slices.Contains(supported, "max_output_tokens") {
		dropped = append(dropped, "max_output_tokens")
	}
	if p.MaxToolCalls != nil && !slices.Contains(supported, "max_tool_calls") {
		dropped = append(dropped, "max_tool_calls")
	}
	if p.Metadata != nil && !slices.Contains(supported, "metadata") {
		dropped = append(dropped, "metadata")
	}
	if p.ParallelToolCalls != nil && !slices.Contains(supported, "parallel_tool_calls") {
		dropped = append(dropped, "parallel_tool_calls")
	}
	if p.PromptCacheKey != nil && !slices.Contains(supported, "prompt_cache_key") {
		dropped = append(dropped, "prompt_cache_key")
	}
	if p.Reasoning != nil && !slices.Contains(supported, "reasoning") {
		dropped = append(dropped, "reasoning")
	}
	if p.ServiceTier != nil && !slices.Contains(supported, "service_tier") {
		dropped = append(dropped, "service_tier")
	}
	if p.Temperature != nil && !slices.Contains(supported, "temperature") {
		dropped = append(dropped, "temperature")
	}
	if p.Text != nil && !slices.Contains(supported, "text") {
		dropped = append(dropped, "text")
	}
	if p.TopLogProbs != nil && !slices.Contains(supported, "top_logprobs") {
		dropped = append(dropped, "top_logprobs")
	}
	if p.TopP != nil && !slices.Contains(supported, "top_p") {
		dropped = append(dropped, "top_p")
	}
	if p.ToolChoice != nil && !slices.Contains(supported, "tool_choice") {
		dropped = append(dropped, "tool_choice")
	}
	if len(p.Tools) > 0 && !slices.Contains(supported, "tools") {
		dropped = append(dropped, "tools")
	}
	return dropped
}
```
Missing fields in unsupportedResponsesParams.
Comparing with core/schemas/responses.go (lines 227-254), these fields are not checked:
- Background (pointer) → "background"
- Conversation (pointer) → "conversation"
- Include (slice) → "include"
- Instructions (pointer) → "instructions"
- PreviousResponseID (pointer) → "previous_response_id"
- SafetyIdentifier (pointer) → "safety_identifier"
- Store (pointer) → "store"
- StreamOptions (pointer) → "stream_options"
- Truncation (pointer) → "truncation"
- User (pointer) → "user"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@plugins/litellmcompat/dropparams.go` around lines 98 - 139, The function
unsupportedResponsesParams is missing checks for several ResponsesParameters
fields; update it to also inspect p.Background, p.Conversation, p.Include,
p.Instructions, p.PreviousResponseID, p.SafetyIdentifier, p.Store,
p.StreamOptions, p.Truncation, and p.User and append their corresponding string
names ("background", "conversation", "include", "instructions",
"previous_response_id", "safety_identifier", "store", "stream_options",
"truncation", "user") to dropped when the pointer/slice is non-nil or non-empty
and the supported slice does not contain that name (use the existing
slices.Contains(supported, ...) pattern as used for other fields in
unsupportedResponsesParams).
```go
func unsupportedTextCompletionParams(p *schemas.TextCompletionParameters, supported []string) []string {
	var dropped []string
	if p.FrequencyPenalty != nil && !slices.Contains(supported, "frequency_penalty") {
		dropped = append(dropped, "frequency_penalty")
	}
	if p.LogitBias != nil && !slices.Contains(supported, "logit_bias") {
		dropped = append(dropped, "logit_bias")
	}
	if p.LogProbs != nil && !slices.Contains(supported, "logprobs") {
		dropped = append(dropped, "logprobs")
	}
	if p.MaxTokens != nil && !slices.Contains(supported, "max_tokens") {
		dropped = append(dropped, "max_tokens")
	}
	if p.N != nil && !slices.Contains(supported, "n") {
		dropped = append(dropped, "n")
	}
	if p.PresencePenalty != nil && !slices.Contains(supported, "presence_penalty") {
		dropped = append(dropped, "presence_penalty")
	}
	if p.Seed != nil && !slices.Contains(supported, "seed") {
		dropped = append(dropped, "seed")
	}
	if len(p.Stop) > 0 && !slices.Contains(supported, "stop") {
		dropped = append(dropped, "stop")
	}
	if p.Temperature != nil && !slices.Contains(supported, "temperature") {
		dropped = append(dropped, "temperature")
	}
	if p.TopP != nil && !slices.Contains(supported, "top_p") {
		dropped = append(dropped, "top_p")
	}
	return dropped
}
```
Missing fields in unsupportedTextCompletionParams.
Comparing with core/schemas/textcompletions.go (lines 117-137), these fields are not checked:
- BestOf (pointer) → "best_of"
- Echo (pointer) → "echo"
- Suffix (pointer) → "suffix"
- StreamOptions (pointer) → "stream_options"
- User (pointer) → "user"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@plugins/litellmcompat/dropparams.go` around lines 142 - 174, The function
unsupportedTextCompletionParams is missing checks for several
TextCompletionParameters fields; update unsupportedTextCompletionParams to
detect and append the missing names when they are set but unsupported: add
checks for p.BestOf != nil -> append "best_of", p.Echo != nil -> append "echo",
p.Suffix != nil -> append "suffix", p.StreamOptions != nil -> append
"stream_options", and p.User != nil -> append "user", using the same pattern as
existing checks (slices.Contains(supported, "...") before append) so the
function returns these additional dropped parameter names.
Force-pushed d24a2b1 to a5a5854 (Compare)
Force-pushed 0d26248 to 7ee6c3b (Compare)

Summary
Adds parameter filtering functionality to the litellmcompat plugin that automatically removes unsupported parameters from provider requests based on model catalog allowlists. This prevents provider errors when clients send parameters that specific models don't support.
Changes
- Adds a GetSupportedParameters() method to ModelCatalog to retrieve supported parameter lists for models
- Indexes supported_parameters from model parameter data alongside the existing supported_endpoints
- Adds BifrostContextKeyLiteLLMCompatDroppedParams to pass filtered parameter names to request processing
- Adds a dropUnsupportedParams() function in provider utils that removes JSON fields listed in the context (temperature, tools, response_format, etc.)
Type of change
Affected areas
How to test
Test parameter filtering by sending requests with unsupported parameters to models that have supported_parameters defined in the catalog. Ensure model catalog data includes supported_parameters arrays for the test models.
Breaking changes
Security considerations
Parameter filtering only removes top-level JSON fields and doesn't modify nested structures. The filtering is based on allowlists from the model catalog, ensuring only documented parameters are preserved.
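The top-level-only scope can be demonstrated concretely. A hedged sketch (the helper name is hypothetical; it mirrors the filtering behavior described above, not the actual provider-utils code): deleting a dropped key removes only the top-level field, while an identically named key nested deeper in the body survives.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// dropTopLevel deletes only top-level keys; nested occurrences of the same
// name are deliberately untouched, matching the PR's filtering scope.
func dropTopLevel(body []byte, dropped []string) ([]byte, error) {
	var m map[string]json.RawMessage
	if err := json.Unmarshal(body, &m); err != nil {
		return nil, err
	}
	for _, k := range dropped {
		delete(m, k)
	}
	return json.Marshal(m)
}

func main() {
	// "temperature" appears both top-level and nested inside a tool config.
	body := []byte(`{"temperature":0.7,"tools":[{"config":{"temperature":1.0}}]}`)
	out, err := dropTopLevel(body, []string{"temperature"})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out)) // the nested "temperature" survives
}
```

Because values are held as json.RawMessage, nested structures pass through byte for byte, which is why only documented top-level parameters are affected by the allowlist.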
Checklist
- I have read docs/contributing/README.md and followed the guidelines