
fix: add store to OPENAI_CHAT_COMPLETION_PARAMS #21195

Merged
1 commit merged into BerriAI:litellm_oss_staging_02_17_2026 from namabile:fix/additional-drop-params-non-openai
Feb 17, 2026

Conversation

@namabile
Contributor

Summary

The OpenAI `store` parameter (documented in OpenAI's Chat Completions API reference) was missing from `OPENAI_CHAT_COMPLETION_PARAMS` in `litellm/constants.py`.

This caused `store` to be unrecognized by helper functions that rely on this list (`get_standard_openai_params()`, the `litellm_proxy` provider config). More critically, code paths using `OPENAI_CHAT_COMPLETION_PARAMS` instead of `DEFAULT_CHAT_COMPLETION_PARAM_VALUES` would treat `store` as a provider-specific parameter and forward it to non-OpenAI providers like Anthropic, causing:

store: Extra inputs are not permitted
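The failure mode can be sketched in a few lines. This is a hypothetical simplification of the routing logic, not litellm's actual code: the point is only that simple list membership decides whether a parameter counts as "standard OpenAI" or gets forwarded as provider-specific.

```python
# Hypothetical sketch (not litellm's real implementation): params absent
# from OPENAI_CHAT_COMPLETION_PARAMS are treated as provider-specific
# and forwarded verbatim to the downstream provider.
OPENAI_CHAT_COMPLETION_PARAMS = ["temperature", "max_tokens"]  # "store" missing (pre-fix)

def split_params(request: dict):
    standard = {k: v for k, v in request.items() if k in OPENAI_CHAT_COMPLETION_PARAMS}
    extra = {k: v for k, v in request.items() if k not in OPENAI_CHAT_COMPLETION_PARAMS}
    return standard, extra

standard, extra = split_params({"temperature": 0.2, "store": True})
print(extra)  # {'store': True} -- leaks through to Anthropic, which rejects it
```

Adding `"store"` to the list moves it into the `standard` bucket, where provider-specific mapping and `drop_params` handling apply.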

Changes

  • litellm/constants.py: Added "store" to OPENAI_CHAT_COMPLETION_PARAMS (it was already in DEFAULT_CHAT_COMPLETION_PARAM_VALUES)
  • tests/llm_translation/test_optional_params.py: Added 3 tests:
    • test_drop_store_param_for_anthropic — verifies drop_params=True strips store for Anthropic
    • test_additional_drop_params_store_for_anthropic — verifies additional_drop_params=["store"] works
    • test_store_in_openai_chat_completion_params — verifies store is in the params list and recognized by get_standard_openai_params()
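The semantics those tests pin down can be illustrated with a self-contained sketch. The function below is an assumed, simplified model of litellm's parameter mapping (the real `get_optional_params()` is far more involved); `map_params` and `anthropic_supported` are names invented for this example.

```python
# Assumed semantics of drop_params / additional_drop_params for a
# provider that does not support "store" (illustrative only).
def map_params(params, supported, drop_params=False, additional_drop_params=None):
    explicitly_dropped = set(additional_drop_params or [])
    out = {}
    for key, value in params.items():
        if key in explicitly_dropped:
            continue  # caller asked for this param to be stripped
        if key not in supported:
            if drop_params:
                continue  # silently drop params the provider would reject
            raise ValueError(f"{key}: Extra inputs are not permitted")
        out[key] = value
    return out

anthropic_supported = ["temperature", "max_tokens"]  # no "store"
assert map_params({"store": True, "temperature": 0.2},
                  anthropic_supported, drop_params=True) == {"temperature": 0.2}
assert map_params({"store": True}, anthropic_supported,
                  additional_drop_params=["store"]) == {}
```

Without either flag, the unsupported param raises instead of being forwarded, which matches the error-path behavior the third test exercises from the other direction.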

Context

Discovered while running OpenClaw through a LiteLLM proxy to Anthropic. OpenClaw sends store=true in its OpenAI-compatible requests, which leaked through to Anthropic and was rejected.

Fixes #19700

The OpenAI `store` parameter (used for storing completions for
distillation/evals) was missing from `OPENAI_CHAT_COMPLETION_PARAMS`.

This caused it to be unrecognized by `get_standard_openai_params()` and
the `litellm_proxy` provider config. It also meant that code paths using
this list (rather than `DEFAULT_CHAT_COMPLETION_PARAM_VALUES`) would
treat `store` as a provider-specific parameter and forward it to
non-OpenAI providers like Anthropic, resulting in:

    "store: Extra inputs are not permitted"

Fixes BerriAI#19700
@vercel

vercel bot commented Feb 14, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: litellm
Deployment: Error
Updated (UTC): Feb 14, 2026 7:17am


@CLAassistant

CLAassistant commented Feb 14, 2026

CLA assistant check
All committers have signed the CLA.

@greptile-apps
Contributor

greptile-apps bot commented Feb 14, 2026

Greptile Overview

Greptile Summary

Adds "store" to OPENAI_CHAT_COMPLETION_PARAMS in litellm/constants.py, fixing an inconsistency where store was already present in DEFAULT_CHAT_COMPLETION_PARAM_VALUES but missing from the params list. This caused get_standard_openai_params() and LiteLLMProxyChatConfig to not recognize store as a standard OpenAI parameter, leading to it being forwarded to non-OpenAI providers (e.g. Anthropic) and causing "Extra inputs are not permitted" errors.

  • Bug fix: Added "store" to OPENAI_CHAT_COMPLETION_PARAMS constant list
  • Tests: 3 new tests verify store is dropped for Anthropic with drop_params=True, additional_drop_params=["store"], and is recognized as a standard OpenAI param
  • Whitespace: Minor trailing whitespace cleanup in existing test functions

Fixes #19700

Confidence Score: 5/5

  • This PR is safe to merge — it is a minimal, correct one-line constant addition with comprehensive tests.
  • The change is a single string addition to a constant list, bringing it in line with the already-existing entry in DEFAULT_CHAT_COMPLETION_PARAM_VALUES. The fix is well-motivated by a real issue ([Bug]: drop_params field doesn't work #19700), thoroughly tested with 3 new tests, and introduces minimal regression risk.
  • No files require special attention.

Important Files Changed

  • litellm/constants.py: Adds "store" to the OPENAI_CHAT_COMPLETION_PARAMS list, aligning it with DEFAULT_CHAT_COMPLETION_PARAM_VALUES, which already included store. A correct, minimal one-line fix.
  • tests/llm_translation/test_optional_params.py: Adds 3 new tests verifying store is properly dropped for Anthropic (with drop_params and additional_drop_params) and is recognized in OPENAI_CHAT_COMPLETION_PARAMS. Also includes minor whitespace cleanup (trailing spaces removed).

Flowchart

flowchart TD
    A["Client sends store=true<br/>(e.g. OpenClaw via proxy)"] --> B["get_optional_params()"]
    B --> C["pre_process_non_default_params()<br/>checks DEFAULT_CHAT_COMPLETION_PARAM_VALUES"]
    C --> D{"Provider supports store?"}
    D -->|"OpenAI"| E["get_standard_openai_params()<br/>checks OPENAI_CHAT_COMPLETION_PARAMS"]
    E --> F{"store in OPENAI_CHAT_COMPLETION_PARAMS?"}
    F -->|"Before fix: No"| G["store silently dropped<br/>for OpenAI calls"]
    F -->|"After fix: Yes"| H["store forwarded<br/>to OpenAI API ✓"]
    D -->|"Anthropic"| I{"drop_params=True?"}
    I -->|"Yes"| J["store dropped ✓"]
    I -->|"No"| K["UnsupportedParamsError raised"]

Last reviewed commit: bc829f9

Contributor

@greptile-apps greptile-apps bot left a comment


2 files reviewed, 1 comment


@greptile-apps
Contributor

greptile-apps bot commented Feb 14, 2026

Additional Comments (1)

litellm/constants.py
Pre-existing: duplicate "temperature" entry

Not introduced by this PR, but "temperature" appears twice in OPENAI_CHAT_COMPLETION_PARAMS (lines 530-531). Since this is a list (not a set), the duplicate is harmless for in checks, but it's worth cleaning up.

    "temperature",

@ghost ghost changed the base branch from main to litellm_oss_staging_02_17_2026 February 17, 2026 04:25
@ghost ghost merged commit 4978df8 into BerriAI:litellm_oss_staging_02_17_2026 Feb 17, 2026
4 of 18 checks passed
This pull request was closed.
