
fix(litellm): parse nested usage payloads#8081

Merged
DOsinga merged 6 commits into aaif-goose:main from jamestotah:fix/litellm-usage-parsing on Mar 25, 2026

Conversation

@jamestotah (Contributor) commented Mar 24, 2026

Summary

  • parse OpenAI-compatible usage data from either the top-level payload or a nested usage object
  • preserve explicit LiteLLM cache read and cache creation token fields without changing provider-reported prompt or total token counts
  • add focused regression tests for nested usage payloads and cache field parsing

Testing

  • cargo test -p goose test_get_usage_

Signed-off-by: James Totah <135163520+jamestotah@users.noreply.github.com>

@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: a7de7abc9a

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment thread on crates/goose/src/providers/formats/openai.rs (outdated)
Signed-off-by: James Totah <135163520+jamestotah@users.noreply.github.com>
@jamestotah changed the title from "fix(litellm): parse cached usage fields" to "fix(litellm): parse nested usage payloads" on Mar 24, 2026

@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 7e39bd6ccd


Comment thread on crates/goose/src/providers/formats/openai.rs (outdated)
jamestotah and others added 4 commits March 23, 2026 23:11
Signed-off-by: James Totah <135163520+jamestotah@users.noreply.github.com>
Signed-off-by: James Totah <135163520+jamestotah@users.noreply.github.com>
The cache-field fallback for input_tokens when prompt_tokens is absent
was dead code — all OpenAI-compatible APIs include prompt_tokens when
cache fields are present. Remove the branch and inline the variable.

Signed-off-by: Douwe Osinga <douwe@squareup.com>
@DOsinga (Collaborator) left a comment


The core fix is solid: unwrapping the nested "usage" key in get_usage() correctly handles the litellm provider passing the full response body, and adding cache_read_input_tokens/cache_write_input_tokens to Usage is a clean forward-looking addition. Both Codex P1 comments were properly addressed by the author.

I merged main in and simplified get_usage slightly: the cache-field fallback for input_tokens when prompt_tokens is absent was dead code (all OpenAI-compatible APIs include prompt_tokens when cache fields are present), so I removed it and inlined the variable.

@DOsinga DOsinga added this pull request to the merge queue Mar 25, 2026
Merged via the queue into aaif-goose:main with commit a47add0 Mar 25, 2026
21 checks passed
vincenzopalazzo pushed a commit to vincenzopalazzo/goose that referenced this pull request Mar 26, 2026
Signed-off-by: James Totah <135163520+jamestotah@users.noreply.github.com>
Signed-off-by: Douwe Osinga <douwe@squareup.com>
Co-authored-by: Douwe Osinga <douwe@squareup.com>
Signed-off-by: Vincenzo Palazzo <vincenzopalazzodev@gmail.com>
2 participants