
feat: drops unsupported openai params#2195

Closed
sammaji wants to merge 1 commit into 03-13-feat_litellmcompat_chat_to_responses from 03-21-feat_drops_unsupported_openai_params

Conversation

Member

@sammaji sammaji commented Mar 21, 2026

Summary

Adds parameter filtering functionality to the litellmcompat plugin that automatically removes unsupported parameters from provider requests based on model catalog allowlists. This prevents provider errors when clients send parameters that specific models don't support.

Changes

  • Added GetSupportedParameters() method to ModelCatalog to retrieve supported parameter lists for models
  • Extended ModelCatalog to parse and index supported_parameters from model parameter data alongside existing supported_endpoints
  • Created parameter filtering logic in litellmcompat plugin that compares request parameters against model allowlists
  • Added context key BifrostContextKeyLiteLLMCompatDroppedParams to pass filtered parameter names to request processing
  • Implemented dropUnsupportedParams() function in provider utils that removes JSON fields listed in the context
  • Added comprehensive parameter checking for all request types (chat, responses, text completion) covering parameters like temperature, tools, response_format, etc.
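The serializer-side removal described above can be sketched as follows. This is a hypothetical stdlib-only sketch: the PR itself strips fields with sjson.DeleteBytes, and the function name here only mirrors the dropUnsupportedParams helper mentioned in the changes.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// dropUnsupportedParams removes the listed top-level fields from a JSON body.
// Illustrative stand-in: the real helper operates on the serialized provider
// request and deletes fields with sjson.DeleteBytes.
func dropUnsupportedParams(body []byte, dropped []string) ([]byte, error) {
	var m map[string]json.RawMessage
	if err := json.Unmarshal(body, &m); err != nil {
		return nil, err
	}
	for _, name := range dropped {
		delete(m, name) // only top-level keys are touched
	}
	return json.Marshal(m)
}

func main() {
	body := []byte(`{"model":"m","temperature":0.7,"tools":[{"type":"function"}]}`)
	out, err := dropUnsupportedParams(body, []string{"tools"})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out)) // {"model":"m","temperature":0.7}
}
```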

Type of change

  • Feature

Affected areas

  • Core (Go)
  • Providers/Integrations
  • Plugins

How to test

Test parameter filtering by sending requests with unsupported parameters to models that have supported_parameters defined in the catalog:

# Test with a model that doesn't support 'tools' parameter
curl -X POST /v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "model-without-tools-support",
    "messages": [{"role": "user", "content": "Hello"}],
    "tools": [{"type": "function", "function": {"name": "test"}}],
    "temperature": 0.7
  }'

# Verify the tools parameter is dropped but temperature is preserved
# Check logs for parameter filtering activity

# Run existing tests
go test ./...
go test ./plugins/litellmcompat/...
go test ./framework/modelcatalog/...

Ensure model catalog data includes supported_parameters arrays for testing models.

Breaking changes

  • Yes
  • No

Security considerations

Parameter filtering only removes top-level JSON fields and doesn't modify nested structures. The filtering is based on allowlists from the model catalog, ensuring only documented parameters are preserved.
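A quick stdlib sketch of the top-level-only behavior noted above: deleting a top-level key leaves an identically named nested key intact. (The PR uses sjson.DeleteBytes; dropTopLevel here is a hypothetical stand-in.)

```go
package main

import (
	"encoding/json"
	"fmt"
)

// dropTopLevel removes exactly one top-level key and nothing else.
func dropTopLevel(body []byte, key string) []byte {
	var m map[string]json.RawMessage
	if err := json.Unmarshal(body, &m); err != nil {
		return body
	}
	delete(m, key)
	out, _ := json.Marshal(m)
	return out
}

func main() {
	// The nested "tools" inside metadata survives; only the top-level one goes.
	body := []byte(`{"tools":[1],"metadata":{"tools":"keep"}}`)
	fmt.Println(string(dropTopLevel(body, "tools"))) // {"metadata":{"tools":"keep"}}
}
```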

Checklist

  • I read docs/contributing/README.md and followed the guidelines
  • I added/updated tests where appropriate
  • I updated documentation where needed
  • I verified builds succeed (Go and UI)
  • I verified the CI pipeline passes locally if applicable

Contributor

coderabbitai Bot commented Mar 21, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 988e2585-76f4-4c9e-aa7e-722ba31b4e34

📥 Commits

Reviewing files that changed from the base of the PR and between 0d26248 and 7ee6c3b.

📒 Files selected for processing (5)
  • core/providers/utils/utils.go
  • core/schemas/bifrost.go
  • framework/modelcatalog/main.go
  • plugins/litellmcompat/dropparams.go
  • plugins/litellmcompat/main.go
✅ Files skipped from review due to trivial changes (3)
  • core/providers/utils/utils.go
  • core/schemas/bifrost.go
  • plugins/litellmcompat/dropparams.go
🚧 Files skipped from review as they are similar to previous changes (2)
  • plugins/litellmcompat/main.go
  • framework/modelcatalog/main.go

📝 Walkthrough

Summary by CodeRabbit

  • New Features
    • API requests now automatically remove parameters not supported by the target model, preventing failures from incompatible parameters.
    • The model catalog now tracks per-model supported parameters to drive this filtering and improve compatibility.

Walkthrough

Adds detection of provider-unsupported request parameters per-model, records them in the context, and strips those top-level JSON fields from serialized provider requests. ModelCatalog now indexes supported parameters for lookup by the plugin.

Changes

Cohort / File(s) Summary
Model Catalog Index
framework/modelcatalog/main.go
Added supportedParams map[string][]string, populate it in buildSupportedOutputsIndex, and exposed GetSupportedParameters(model string) []string. Init/test paths allocate the new map.
Context Key & Request Filtering
core/schemas/bifrost.go, core/providers/utils/utils.go
Added BifrostContextKeyLiteLLMCompatDroppedParams constant. CheckContextAndGetRequestBody now calls dropUnsupportedParams(ctx, jsonBody); new dropUnsupportedParams removes listed top-level JSON fields with sjson.DeleteBytes.
Plugin Detection & Hook Integration
plugins/litellmcompat/dropparams.go, plugins/litellmcompat/main.go
New computeUnsupportedParams identifies set-but-unsupported params for Chat/Responses/TextCompletion variants. PreLLMHook extracts model via getModelFromRequest, queries catalog, computes droppedParams, and stores them in context for downstream removal.
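The catalog lookup and allowlist comparison summarized in this table can be sketched as below. This is a hypothetical minimal version: the ModelCatalog type and GetSupportedParameters signature follow the walkthrough above, but the field name and check loop are illustrative.

```go
package main

import (
	"fmt"
	"slices"
)

// ModelCatalog indexes each model's supported_parameters allowlist.
type ModelCatalog struct {
	supportedParams map[string][]string
}

// GetSupportedParameters returns the allowlist for a model (nil if unknown).
func (c *ModelCatalog) GetSupportedParameters(model string) []string {
	return c.supportedParams[model]
}

func main() {
	catalog := &ModelCatalog{supportedParams: map[string][]string{
		"model-without-tools-support": {"temperature", "top_p"},
	}}
	supported := catalog.GetSupportedParameters("model-without-tools-support")

	// The plugin drops any set parameter absent from the allowlist.
	var dropped []string
	for _, set := range []string{"temperature", "tools"} {
		if !slices.Contains(supported, set) {
			dropped = append(dropped, set)
		}
	}
	fmt.Println(dropped) // [tools]
}
```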

Sequence Diagram

sequenceDiagram
    participant Client
    participant Plugin as LiteLLMCompat<br/>Plugin
    participant Catalog as ModelCatalog
    participant Serializer as Request<br/>Serializer
    participant Provider

    Client->>Plugin: Send BifrostRequest (includes model)
    activate Plugin
    Plugin->>Catalog: GetSupportedParameters(model)
    activate Catalog
    Catalog-->>Plugin: supportedParams
    deactivate Catalog

    Plugin->>Plugin: computeUnsupportedParams(req, supportedParams)
    Plugin->>Plugin: ctx.SetValue(BifrostContextKeyLiteLLMCompatDroppedParams, droppedParams)
    deactivate Plugin

    Plugin->>Serializer: Forward request + context
    activate Serializer
    Serializer->>Serializer: CheckContextAndGetRequestBody(ctx)
    Serializer->>Serializer: dropUnsupportedParams(ctx, jsonBody)
    Serializer->>Provider: Send filtered JSON request
    deactivate Serializer

    Provider-->>Client: Response

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 I hopped through models, sniffed each param tight,
Pulled out the extras that gave them a fright,
Catalog in paw and context in tow,
I trimmed the JSON — now the requests softly go. ✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 70.00%, below the required threshold of 80.00%. Write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
  • Title check: ✅ Passed. The title 'feat: drops unsupported openai params' accurately summarizes the main feature: automatic removal of unsupported parameters from requests using model allowlists.
  • Description check: ✅ Passed. The description follows the template well, covering summary, changes, type, affected areas, testing, and security considerations with comprehensive detail.


@sammaji sammaji mentioned this pull request Mar 21, 2026
9 tasks
Member Author

sammaji commented Mar 21, 2026

Warning

This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite.
Learn more

This stack of pull requests is managed by Graphite. Learn more about stacking.

@sammaji sammaji mentioned this pull request Mar 21, 2026
18 tasks
@sammaji sammaji marked this pull request as ready for review March 21, 2026 06:27
Contributor

@coderabbitai coderabbitai Bot left a comment


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
plugins/litellmcompat/main.go (1)

80-101: ⚠️ Potential issue | 🟠 Major

Reset dropped-params context state at hook entry.

Line 97 only writes BifrostContextKeyLiteLLMCompatDroppedParams when there are drops. If PreLLMHook runs again in the same context lifecycle, the old list can persist and Line 1071 in core/providers/utils/utils.go may delete wrong fields.

💡 Proposed fix
 func (p *LiteLLMCompatPlugin) PreLLMHook(ctx *schemas.BifrostContext, req *schemas.BifrostRequest) (*schemas.BifrostRequest, *schemas.LLMPluginShortCircuit, error) {
 	// Reset context keys
 	if ctx != nil {
 		ctx.SetValue(schemas.BifrostContextKeyShouldConvertTextToChat, false)
 		ctx.SetValue(schemas.BifrostContextKeyShouldConvertChatToResponses, false)
+		ctx.SetValue(schemas.BifrostContextKeyLiteLLMCompatDroppedParams, []string(nil))
 	}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@plugins/litellmcompat/main.go` around lines 80 - 101, When entering
PreLLMHook, always clear any previous dropped-params state so stale values don't
persist: after verifying ctx != nil (the existing reset block) set
schemas.BifrostContextKeyLiteLLMCompatDroppedParams to an empty value (nil or
empty slice) before running
transformTextToChatRequest/transformChatToResponsesRequest and before
computeUnsupportedParams; update the logic around
BifrostContextKeyLiteLLMCompatDroppedParams (and related code paths using
computeUnsupportedParams, getModelFromRequest,
p.modelCatalog.GetSupportedParameters) so the key is explicitly reset on each
hook invocation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Outside diff comments:
In `@plugins/litellmcompat/main.go`:
- Around line 80-101: When entering PreLLMHook, always clear any previous
dropped-params state so stale values don't persist: after verifying ctx != nil
(the existing reset block) set
schemas.BifrostContextKeyLiteLLMCompatDroppedParams to an empty value (nil or
empty slice) before running
transformTextToChatRequest/transformChatToResponsesRequest and before
computeUnsupportedParams; update the logic around
BifrostContextKeyLiteLLMCompatDroppedParams (and related code paths using
computeUnsupportedParams, getModelFromRequest,
p.modelCatalog.GetSupportedParameters) so the key is explicitly reset on each
hook invocation.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: e70d53c7-34dd-40a1-a9cd-dad763c01a55

📥 Commits

Reviewing files that changed from the base of the PR and between dcd8bcc and 0e500f1.

📒 Files selected for processing (5)
  • core/providers/utils/utils.go
  • core/schemas/bifrost.go
  • framework/modelcatalog/main.go
  • plugins/litellmcompat/dropparams.go
  • plugins/litellmcompat/main.go

@sammaji sammaji force-pushed the 03-21-feat_drops_unsupported_openai_params branch from 0e500f1 to 8590b1e Compare March 23, 2026 07:39
@sammaji sammaji force-pushed the 03-13-feat_litellmcompat_chat_to_responses branch 2 times, most recently from 0c27d5a to d24a2b1 Compare March 23, 2026 07:53
@sammaji sammaji force-pushed the 03-21-feat_drops_unsupported_openai_params branch from 8590b1e to 0d26248 Compare March 23, 2026 07:53
Contributor

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@plugins/litellmcompat/dropparams.go`:
- Around line 142-174: The function unsupportedTextCompletionParams is missing
checks for several TextCompletionParameters fields; update
unsupportedTextCompletionParams to detect and append the missing names when they
are set but unsupported: add checks for p.BestOf != nil -> append "best_of",
p.Echo != nil -> append "echo", p.Suffix != nil -> append "suffix",
p.StreamOptions != nil -> append "stream_options", and p.User != nil -> append
"user", using the same pattern as existing checks (slices.Contains(supported,
"...") before append) so the function returns these additional dropped parameter
names.
- Around line 98-139: The function unsupportedResponsesParams is missing checks
for several ResponsesParameters fields; update it to also inspect p.Background,
p.Conversation, p.Include, p.Instructions, p.PreviousResponseID,
p.SafetyIdentifier, p.Store, p.StreamOptions, p.Truncation, and p.User and
append their corresponding string names ("background", "conversation",
"include", "instructions", "previous_response_id", "safety_identifier", "store",
"stream_options", "truncation", "user") to dropped when the pointer/slice is
non-nil or non-empty and the supported slice does not contain that name (use the
existing slices.Contains(supported, ...) pattern as used for other fields in
unsupportedResponsesParams).
- Around line 27-95: The unsupportedChatParams function is missing checks for
several ChatParameters fields so they can be dropped when unsupported; update
unsupportedChatParams to mirror existing patterns (nil checks for pointers,
len>0 for slices) and append the corresponding string keys when not present in
supported: check p.Modalities -> "modalities" (len>0), p.SafetyIdentifier ->
"safety_identifier" (!= nil), p.Store -> "store" (!= nil), p.StreamOptions ->
"stream_options" (!= nil), p.User -> "user" (!= nil), and p.WebSearchOptions ->
"web_search_options" (!= nil), following the same style as the other fields (use
slices.Contains(supported, "...") and append to dropped).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 2821a1a1-0984-407d-a692-64fa70707ccf

📥 Commits

Reviewing files that changed from the base of the PR and between 0e500f1 and 0d26248.

📒 Files selected for processing (5)
  • core/providers/utils/utils.go
  • core/schemas/bifrost.go
  • framework/modelcatalog/main.go
  • plugins/litellmcompat/dropparams.go
  • plugins/litellmcompat/main.go
🚧 Files skipped from review as they are similar to previous changes (1)
  • core/schemas/bifrost.go

Comment on lines +27 to +95
func unsupportedChatParams(p *schemas.ChatParameters, supported []string) []string {
var dropped []string
if p.Audio != nil && !slices.Contains(supported, "audio") {
dropped = append(dropped, "audio")
}
if p.FrequencyPenalty != nil && !slices.Contains(supported, "frequency_penalty") {
dropped = append(dropped, "frequency_penalty")
}
if p.LogitBias != nil && !slices.Contains(supported, "logit_bias") {
dropped = append(dropped, "logit_bias")
}
if p.LogProbs != nil && !slices.Contains(supported, "logprobs") {
dropped = append(dropped, "logprobs")
}
if p.MaxCompletionTokens != nil && !slices.Contains(supported, "max_completion_tokens") {
dropped = append(dropped, "max_completion_tokens")
}
if p.Metadata != nil && !slices.Contains(supported, "metadata") {
dropped = append(dropped, "metadata")
}
if p.ParallelToolCalls != nil && !slices.Contains(supported, "parallel_tool_calls") {
dropped = append(dropped, "parallel_tool_calls")
}
if p.Prediction != nil && !slices.Contains(supported, "prediction") {
dropped = append(dropped, "prediction")
}
if p.PresencePenalty != nil && !slices.Contains(supported, "presence_penalty") {
dropped = append(dropped, "presence_penalty")
}
if p.PromptCacheKey != nil && !slices.Contains(supported, "prompt_cache_key") {
dropped = append(dropped, "prompt_cache_key")
}
if p.PromptCacheRetention != nil && !slices.Contains(supported, "prompt_cache_retention") {
dropped = append(dropped, "prompt_cache_retention")
}
if p.Reasoning != nil && !slices.Contains(supported, "reasoning") {
dropped = append(dropped, "reasoning")
}
if p.ResponseFormat != nil && !slices.Contains(supported, "response_format") {
dropped = append(dropped, "response_format")
}
if p.Seed != nil && !slices.Contains(supported, "seed") {
dropped = append(dropped, "seed")
}
if p.ServiceTier != nil && !slices.Contains(supported, "service_tier") {
dropped = append(dropped, "service_tier")
}
if len(p.Stop) > 0 && !slices.Contains(supported, "stop") {
dropped = append(dropped, "stop")
}
if p.Temperature != nil && !slices.Contains(supported, "temperature") {
dropped = append(dropped, "temperature")
}
if p.TopLogProbs != nil && !slices.Contains(supported, "top_logprobs") {
dropped = append(dropped, "top_logprobs")
}
if p.TopP != nil && !slices.Contains(supported, "top_p") {
dropped = append(dropped, "top_p")
}
if p.ToolChoice != nil && !slices.Contains(supported, "tool_choice") {
dropped = append(dropped, "tool_choice")
}
if len(p.Tools) > 0 && !slices.Contains(supported, "tools") {
dropped = append(dropped, "tools")
}
if p.Verbosity != nil && !slices.Contains(supported, "verbosity") {
dropped = append(dropped, "verbosity")
}
return dropped
}


⚠️ Potential issue | 🟡 Minor

Missing fields in unsupportedChatParams.

Several ChatParameters fields are not checked against the allowlist. Comparing with core/schemas/chatcompletions.go (lines 49-82), the following are missing:

  • Modalities (slice) → "modalities"
  • SafetyIdentifier (pointer) → "safety_identifier"
  • Store (pointer) → "store"
  • StreamOptions (pointer) → "stream_options"
  • User (pointer) → "user"
  • WebSearchOptions (pointer) → "web_search_options"

These fields will pass through unfiltered even when unsupported by the model.

🔧 Proposed fix to add missing fields
 	if p.Verbosity != nil && !slices.Contains(supported, "verbosity") {
 		dropped = append(dropped, "verbosity")
 	}
+	if len(p.Modalities) > 0 && !slices.Contains(supported, "modalities") {
+		dropped = append(dropped, "modalities")
+	}
+	if p.SafetyIdentifier != nil && !slices.Contains(supported, "safety_identifier") {
+		dropped = append(dropped, "safety_identifier")
+	}
+	if p.Store != nil && !slices.Contains(supported, "store") {
+		dropped = append(dropped, "store")
+	}
+	if p.StreamOptions != nil && !slices.Contains(supported, "stream_options") {
+		dropped = append(dropped, "stream_options")
+	}
+	if p.User != nil && !slices.Contains(supported, "user") {
+		dropped = append(dropped, "user")
+	}
+	if p.WebSearchOptions != nil && !slices.Contains(supported, "web_search_options") {
+		dropped = append(dropped, "web_search_options")
+	}
 	return dropped
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@plugins/litellmcompat/dropparams.go` around lines 27 - 95, The
unsupportedChatParams function is missing checks for several ChatParameters
fields so they can be dropped when unsupported; update unsupportedChatParams to
mirror existing patterns (nil checks for pointers, len>0 for slices) and append
the corresponding string keys when not present in supported: check p.Modalities
-> "modalities" (len>0), p.SafetyIdentifier -> "safety_identifier" (!= nil),
p.Store -> "store" (!= nil), p.StreamOptions -> "stream_options" (!= nil),
p.User -> "user" (!= nil), and p.WebSearchOptions -> "web_search_options" (!=
nil), following the same style as the other fields (use
slices.Contains(supported, "...") and append to dropped).

Comment on lines +98 to +139
func unsupportedResponsesParams(p *schemas.ResponsesParameters, supported []string) []string {
var dropped []string
if p.MaxOutputTokens != nil && !slices.Contains(supported, "max_output_tokens") {
dropped = append(dropped, "max_output_tokens")
}
if p.MaxToolCalls != nil && !slices.Contains(supported, "max_tool_calls") {
dropped = append(dropped, "max_tool_calls")
}
if p.Metadata != nil && !slices.Contains(supported, "metadata") {
dropped = append(dropped, "metadata")
}
if p.ParallelToolCalls != nil && !slices.Contains(supported, "parallel_tool_calls") {
dropped = append(dropped, "parallel_tool_calls")
}
if p.PromptCacheKey != nil && !slices.Contains(supported, "prompt_cache_key") {
dropped = append(dropped, "prompt_cache_key")
}
if p.Reasoning != nil && !slices.Contains(supported, "reasoning") {
dropped = append(dropped, "reasoning")
}
if p.ServiceTier != nil && !slices.Contains(supported, "service_tier") {
dropped = append(dropped, "service_tier")
}
if p.Temperature != nil && !slices.Contains(supported, "temperature") {
dropped = append(dropped, "temperature")
}
if p.Text != nil && !slices.Contains(supported, "text") {
dropped = append(dropped, "text")
}
if p.TopLogProbs != nil && !slices.Contains(supported, "top_logprobs") {
dropped = append(dropped, "top_logprobs")
}
if p.TopP != nil && !slices.Contains(supported, "top_p") {
dropped = append(dropped, "top_p")
}
if p.ToolChoice != nil && !slices.Contains(supported, "tool_choice") {
dropped = append(dropped, "tool_choice")
}
if len(p.Tools) > 0 && !slices.Contains(supported, "tools") {
dropped = append(dropped, "tools")
}
return dropped
}


⚠️ Potential issue | 🟡 Minor

Missing fields in unsupportedResponsesParams.

Comparing with core/schemas/responses.go (lines 227-254), these fields are not checked:

  • Background (pointer) → "background"
  • Conversation (pointer) → "conversation"
  • Include (slice) → "include"
  • Instructions (pointer) → "instructions"
  • PreviousResponseID (pointer) → "previous_response_id"
  • SafetyIdentifier (pointer) → "safety_identifier"
  • Store (pointer) → "store"
  • StreamOptions (pointer) → "stream_options"
  • Truncation (pointer) → "truncation"
  • User (pointer) → "user"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@plugins/litellmcompat/dropparams.go` around lines 98 - 139, The function
unsupportedResponsesParams is missing checks for several ResponsesParameters
fields; update it to also inspect p.Background, p.Conversation, p.Include,
p.Instructions, p.PreviousResponseID, p.SafetyIdentifier, p.Store,
p.StreamOptions, p.Truncation, and p.User and append their corresponding string
names ("background", "conversation", "include", "instructions",
"previous_response_id", "safety_identifier", "store", "stream_options",
"truncation", "user") to dropped when the pointer/slice is non-nil or non-empty
and the supported slice does not contain that name (use the existing
slices.Contains(supported, ...) pattern as used for other fields in
unsupportedResponsesParams).

Comment on lines +142 to +174
func unsupportedTextCompletionParams(p *schemas.TextCompletionParameters, supported []string) []string {
var dropped []string
if p.FrequencyPenalty != nil && !slices.Contains(supported, "frequency_penalty") {
dropped = append(dropped, "frequency_penalty")
}
if p.LogitBias != nil && !slices.Contains(supported, "logit_bias") {
dropped = append(dropped, "logit_bias")
}
if p.LogProbs != nil && !slices.Contains(supported, "logprobs") {
dropped = append(dropped, "logprobs")
}
if p.MaxTokens != nil && !slices.Contains(supported, "max_tokens") {
dropped = append(dropped, "max_tokens")
}
if p.N != nil && !slices.Contains(supported, "n") {
dropped = append(dropped, "n")
}
if p.PresencePenalty != nil && !slices.Contains(supported, "presence_penalty") {
dropped = append(dropped, "presence_penalty")
}
if p.Seed != nil && !slices.Contains(supported, "seed") {
dropped = append(dropped, "seed")
}
if len(p.Stop) > 0 && !slices.Contains(supported, "stop") {
dropped = append(dropped, "stop")
}
if p.Temperature != nil && !slices.Contains(supported, "temperature") {
dropped = append(dropped, "temperature")
}
if p.TopP != nil && !slices.Contains(supported, "top_p") {
dropped = append(dropped, "top_p")
}
return dropped
}


⚠️ Potential issue | 🟡 Minor

Missing fields in unsupportedTextCompletionParams.

Comparing with core/schemas/textcompletions.go (lines 117-137), these fields are not checked:

  • BestOf (pointer) → "best_of"
  • Echo (pointer) → "echo"
  • Suffix (pointer) → "suffix"
  • StreamOptions (pointer) → "stream_options"
  • User (pointer) → "user"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@plugins/litellmcompat/dropparams.go` around lines 142 - 174, The function
unsupportedTextCompletionParams is missing checks for several
TextCompletionParameters fields; update unsupportedTextCompletionParams to
detect and append the missing names when they are set but unsupported: add
checks for p.BestOf != nil -> append "best_of", p.Echo != nil -> append "echo",
p.Suffix != nil -> append "suffix", p.StreamOptions != nil -> append
"stream_options", and p.User != nil -> append "user", using the same pattern as
existing checks (slices.Contains(supported, "...") before append) so the
function returns these additional dropped parameter names.

@sammaji sammaji force-pushed the 03-13-feat_litellmcompat_chat_to_responses branch from d24a2b1 to a5a5854 Compare March 23, 2026 08:15
@sammaji sammaji force-pushed the 03-21-feat_drops_unsupported_openai_params branch from 0d26248 to 7ee6c3b Compare March 23, 2026 08:15
@sammaji sammaji closed this Mar 23, 2026