fix: preserve metadata for custom callbacks on codex/responses path (#21204) #21243
Conversation
(BerriAI#21204)
- Use `metadata` or `litellm_metadata` when calling `update_environment_variables` in `responses/main.py` so metadata is not overwritten by `None` on the bridge path (completion -> responses API).
- Add tests for metadata in a custom callback for codex models and for `litellm_metadata` in `aresponses()`.

Co-authored-by: Cursor <cursoragent@cursor.com>
Greptile Summary

Fixes a regression where user-supplied metadata was overwritten by `None` on the completion → responses API bridge path, so custom callbacks no longer received it.

Confidence Score: 4/5
| Filename | Overview |
|---|---|
| litellm/responses/main.py | Adds fallback logic to preserve metadata from litellm_metadata kwargs when metadata parameter is None (codex bridge path). The fix is correct and well-targeted. |
| tests/test_litellm/responses/test_metadata_codex_callback.py | New test file with two async tests verifying metadata preservation for codex models and direct responses API calls. Tests use mocks (no real network calls) but don't clean up litellm.callbacks global state. |
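As the review notes, the new tests register a callback on the global `litellm.callbacks` list without restoring it. A minimal sketch of a save/restore pattern that avoids leaking state between tests — using a `SimpleNamespace` stand-in for the real `litellm` module, so the names here are illustrative, not the actual test code:

```python
# Sketch of restoring global callback state around a test.
# `litellm` below is a stand-in namespace, not the real module;
# the save/restore pattern is the point, not the specific names.
from types import SimpleNamespace

litellm = SimpleNamespace(callbacks=[])  # stand-in for the litellm module


class CapturingCallback:
    """Records the metadata each logged call carried."""

    def __init__(self):
        self.seen_metadata = []

    def log_success_event(self, kwargs):
        self.seen_metadata.append(kwargs.get("litellm_params", {}).get("metadata"))


def run_test_with_clean_callbacks():
    saved = list(litellm.callbacks)  # snapshot global state
    cb = CapturingCallback()
    litellm.callbacks.append(cb)
    try:
        # ... the real test body would invoke acompletion()/aresponses() here ...
        cb.log_success_event({"litellm_params": {"metadata": {"foo": "bar"}}})
        return cb.seen_metadata
    finally:
        litellm.callbacks[:] = saved  # restore, even if the test body raises


result = run_test_with_clean_callbacks()
```

In a pytest suite the same effect is usually achieved with a fixture that yields and then restores the list, so every test gets the cleanup automatically.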
Sequence Diagram

```mermaid
sequenceDiagram
participant User
participant acompletion as litellm.acompletion()
participant Bridge as responses_api_bridge
participant Responses as responses/main.py
participant Logging as LiteLLMLoggingObj
participant Callback as CustomLogger
User->>acompletion: metadata={"foo": "bar"}
acompletion->>Bridge: litellm_params={metadata: {"foo": "bar"}}
Bridge->>Bridge: _build_sanitized_litellm_params()
Note over Bridge: Merges metadata → litellm_metadata<br/>Strips metadata from params
Bridge->>Responses: responses(**request_data)<br/>litellm_metadata={"foo": "bar"}, metadata=None
Responses->>Responses: metadata_for_callbacks = metadata or kwargs["litellm_metadata"]
Note over Responses: FIX: Falls back to litellm_metadata<br/>when metadata is None
Responses->>Logging: update_environment_variables(metadata=metadata_for_callbacks)
Logging->>Callback: async_log_success_event(kwargs)<br/>kwargs["litellm_params"]["metadata"] = {"foo": "bar"}
```
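The fallback the diagram highlights can be sketched as a small helper. This is an illustrative stand-alone function, not the actual code — in the PR the fix is inline in `litellm/responses/main.py`:

```python
# Sketch of the metadata fallback applied on the bridge path:
# prefer the explicit `metadata` argument, but fall back to
# `litellm_metadata` (set by the completion -> responses bridge)
# instead of passing None to the logging object.
from typing import Any, Dict, Optional


def resolve_metadata_for_callbacks(
    metadata: Optional[Dict[str, Any]],
    kwargs: Dict[str, Any],
) -> Optional[Dict[str, Any]]:
    # `or` falls through when metadata is None, matching the described fix
    return metadata or kwargs.get("litellm_metadata")


# Bridge path: metadata was stripped, litellm_metadata carries the value
bridged = resolve_metadata_for_callbacks(None, {"litellm_metadata": {"foo": "bar"}})

# Direct path: an explicit metadata argument wins
direct = resolve_metadata_for_callbacks({"a": 1}, {"litellm_metadata": {"foo": "bar"}})
```

Note that `or` also falls through for an empty dict `{}`; if empty-but-present metadata ever needs to take precedence, an explicit `is None` check would be required instead.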
Last reviewed commit: fb2ee17
@saneroen Can we expect the metadata in kwargs['standard_logging_object'] as well?
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
Since the SLO is built in the logging object, I believe this should work.
Merged commit dcff326 into BerriAI:litellm_oss_staging_02_16_2026
Description
fix: preserve metadata for custom callbacks on codex/responses path (#21204)
Relevant issues
Fixes #21204
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement - see details)
- My PR passes all unit tests on `make test-unit`
- I have run `@greptileai` and received a Confidence Score of at least 4/5 before requesting a maintainer review

CI (LiteLLM team)
Branch creation CI run
Link:
CI run for the last commit
Link:
Merge / cherry-pick CI run
Links:
Type
🐛 Bug Fix
Changes
- Use `metadata` (or `litellm_metadata`) when calling `update_environment_variables` on the responses API bridge path so it is not overwritten by `None`.
- Add tests for metadata in a custom callback via `acompletion()` and for direct `aresponses()` with `litellm_metadata`.