fix: add store to OPENAI_CHAT_COMPLETION_PARAMS #21195
The OpenAI `store` parameter (used for storing completions for
distillation/evals) was missing from `OPENAI_CHAT_COMPLETION_PARAMS`.
This caused it to be unrecognized by `get_standard_openai_params()` and
the `litellm_proxy` provider config. It also meant that code paths using
this list (rather than `DEFAULT_CHAT_COMPLETION_PARAM_VALUES`) would
treat `store` as a provider-specific parameter and forward it to
non-OpenAI providers like Anthropic, resulting in:
"store: Extra inputs are not permitted"
Fixes BerriAI#19700
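The failure mode can be sketched with a simplified version of the allowlist check. The constant name mirrors litellm's, but the splitting logic below is illustrative only, not the actual litellm code:

```python
# Simplified sketch: an allowlist decides whether a request parameter is a
# "standard OpenAI" param or a provider-specific extra. The real list in
# litellm/constants.py is much longer; "store" is the entry this PR adds.
OPENAI_CHAT_COMPLETION_PARAMS = ["temperature", "max_tokens", "store"]

def split_params(params):
    """Split request params into standard OpenAI params and leftovers."""
    standard = {k: v for k, v in params.items() if k in OPENAI_CHAT_COMPLETION_PARAMS}
    extra = {k: v for k, v in params.items() if k not in OPENAI_CHAT_COMPLETION_PARAMS}
    return standard, extra

standard, extra = split_params({"temperature": 0.2, "store": True})
# Before the fix, "store" would land in `extra` and be forwarded to
# non-OpenAI providers; after the fix it is recognized as standard.
```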
Greptile Overview

Greptile Summary: Adds `"store"` to `OPENAI_CHAT_COMPLETION_PARAMS`.
Fixes #19700. Confidence Score: 5/5
| Filename | Overview |
|---|---|
| litellm/constants.py | Adds "store" to OPENAI_CHAT_COMPLETION_PARAMS list, aligning it with DEFAULT_CHAT_COMPLETION_PARAM_VALUES which already included store. This is a correct, minimal one-line fix. |
| tests/llm_translation/test_optional_params.py | Adds 3 new tests verifying store is properly dropped for Anthropic (with drop_params and additional_drop_params) and is recognized in OPENAI_CHAT_COMPLETION_PARAMS. Also includes minor whitespace cleanup (trailing spaces removed). |
Flowchart

```mermaid
flowchart TD
    A["Client sends store=true<br/>(e.g. OpenClaw via proxy)"] --> B["get_optional_params()"]
    B --> C["pre_process_non_default_params()<br/>checks DEFAULT_CHAT_COMPLETION_PARAM_VALUES"]
    C --> D{"Provider supports store?"}
    D -->|"OpenAI"| E["get_standard_openai_params()<br/>checks OPENAI_CHAT_COMPLETION_PARAMS"]
    E --> F{"store in OPENAI_CHAT_COMPLETION_PARAMS?"}
    F -->|"Before fix: No"| G["store silently dropped<br/>for OpenAI calls"]
    F -->|"After fix: Yes"| H["store forwarded<br/>to OpenAI API ✓"]
    D -->|"Anthropic"| I{"drop_params=True?"}
    I -->|"Yes"| J["store dropped ✓"]
    I -->|"No"| K["UnsupportedParamsError raised"]
```
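The Anthropic branch of the flowchart can be sketched in a few lines. This is a self-contained illustration of the decision logic, not litellm's actual implementation; the provider check is hard-coded for clarity:

```python
class UnsupportedParamsError(Exception):
    """Raised when a provider gets a param it does not support (sketch)."""

def handle_store(provider, params, drop_params=False, additional_drop_params=None):
    """Illustrative sketch of the flowchart's branches for `store`."""
    params = dict(params)
    drops = additional_drop_params or []
    if "store" in params and provider != "openai":
        if drop_params or "store" in drops:
            params.pop("store")  # "store dropped ✓" branch
        else:
            raise UnsupportedParamsError(f"{provider} does not support 'store'")
    return params
```

With the fix in place, OpenAI keeps `store`, while Anthropic either drops it (when `drop_params=True` or `additional_drop_params=["store"]`) or raises.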
Last reviewed commit: bc829f9
Additional Comments (1)
Not introduced by this PR, but
Merged commit 4978df8 into BerriAI:litellm_oss_staging_02_17_2026
Summary
The OpenAI `store` parameter (docs) was missing from `OPENAI_CHAT_COMPLETION_PARAMS` in `litellm/constants.py`. This caused `store` to be unrecognized by helper functions that rely on this list (`get_standard_openai_params()`, the `litellm_proxy` provider config). More critically, code paths using `OPENAI_CHAT_COMPLETION_PARAMS` instead of `DEFAULT_CHAT_COMPLETION_PARAM_VALUES` would treat `store` as a provider-specific parameter and forward it to non-OpenAI providers like Anthropic, causing:

"store: Extra inputs are not permitted"

Changes
- `litellm/constants.py`: Added `"store"` to `OPENAI_CHAT_COMPLETION_PARAMS` (it was already in `DEFAULT_CHAT_COMPLETION_PARAM_VALUES`)
- `tests/llm_translation/test_optional_params.py`: Added 3 tests:
  - `test_drop_store_param_for_anthropic`: verifies `drop_params=True` strips `store` for Anthropic
  - `test_additional_drop_params_store_for_anthropic`: verifies `additional_drop_params=["store"]` works
  - `test_store_in_openai_chat_completion_params`: verifies `store` is in the params list and recognized by `get_standard_openai_params()`

Context
Discovered while running OpenClaw through a LiteLLM proxy to Anthropic. OpenClaw sends `store=true` in its OpenAI-compatible requests, which leaked through to Anthropic and was rejected.

Fixes #19700
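The leak described above can be reproduced in miniature. Until running a litellm version with this fix, a proxy operator could strip the field before forwarding; the helper and the model name below are hypothetical, chosen only to illustrate the shape of the workaround:

```python
# Hedged sketch: drop fields a target provider rejects before forwarding an
# OpenAI-compatible payload. "store" is the field Anthropic rejected here;
# the set, helper name, and model string are illustrative assumptions.
ANTHROPIC_REJECTED_FIELDS = {"store"}

def sanitize_payload(payload, rejected=frozenset(ANTHROPIC_REJECTED_FIELDS)):
    """Return a copy of the payload without provider-rejected fields."""
    return {k: v for k, v in payload.items() if k not in rejected}

request = {
    "model": "claude-sonnet",  # hypothetical model name
    "messages": [{"role": "user", "content": "hello"}],
    "store": True,  # sent by OpenAI-compatible clients such as OpenClaw
}
clean = sanitize_payload(request)
```

With the fix merged, `drop_params=True` (or `additional_drop_params=["store"]`) achieves the same result inside litellm itself.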