
fix(proxy): preserve and forward OAuth Authorization headers through proxy layer#19912

Merged
1 commit merged into BerriAI:litellm_oss_staging_02_17_2026 from iamadamreed:fix/anthropic-oauth-token-forwarding
Feb 17, 2026

Conversation

@iamadamreed
Contributor

@iamadamreed iamadamreed commented Jan 28, 2026

Summary

PR #21039 fixed OAuth token handling at the LLM layer (using Authorization: Bearer instead of x-api-key), but the proxy layer still strips the Authorization header in clean_headers() before it ever reaches the Anthropic code. This breaks OAuth for proxy users (e.g., Claude Code Max through LiteLLM proxy).

This PR is the complementary proxy-layer fix.

Relevant issues

Fixes #19618

Changes

| File | Change |
| --- | --- |
| litellm/llms/anthropic/common_utils.py | Added is_anthropic_oauth_key() helper to detect the OAuth token prefix (sk-ant-oat*) |
| litellm/proxy/litellm_pre_call_utils.py | Modified clean_headers() to preserve the Authorization header when it contains an Anthropic OAuth token |
| litellm/proxy/litellm_pre_call_utils.py | Modified add_provider_specific_headers_to_request() to forward the OAuth Authorization header via ProviderSpecificHeader, scoped to Anthropic-compatible providers |

How it works

  1. clean_headers() normally strips the Authorization header (it's in _SPECIAL_HEADERS_CACHE). With this fix, it preserves it when the value contains an Anthropic OAuth token (sk-ant-oat*)
  2. add_provider_specific_headers_to_request() picks up the preserved OAuth Authorization header and wraps it in a ProviderSpecificHeader targeting only anthropic, bedrock, and vertex_ai providers
  3. This ensures OAuth tokens are forwarded through the proxy to Anthropic-compatible providers, but not leaked to other providers in the router
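The three steps above can be sketched in simplified form. The names (`is_anthropic_oauth_key`, `clean_headers`, the `sk-ant-oat` prefix) follow the PR, but this is an illustrative sketch, not the actual LiteLLM implementation; the set of stripped headers here is an assumption.

```python
# Illustrative sketch of the proxy-layer logic, NOT litellm's actual code:
# function names follow the PR, the stripped-header set is assumed.

ANTHROPIC_OAUTH_TOKEN_PREFIX = "sk-ant-oat"
_SPECIAL_HEADERS = {"authorization", "x-api-key"}  # assumed subset


def is_anthropic_oauth_key(value):
    """Detect an Anthropic OAuth token, raw or 'Bearer '-prefixed."""
    if not value:
        return False
    token = value[len("Bearer "):] if value.startswith("Bearer ") else value
    return token.startswith(ANTHROPIC_OAUTH_TOKEN_PREFIX)


def clean_headers(headers):
    """Strip special headers, but keep an OAuth Authorization header."""
    cleaned = {}
    for name, value in headers.items():
        if name.lower() in _SPECIAL_HEADERS:
            # Step 1: normally stripped; preserved only for OAuth tokens.
            if not (name.lower() == "authorization" and is_anthropic_oauth_key(value)):
                continue
        cleaned[name] = value
    return cleaned
```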

Relationship to #21039

| Layer | Fixed by |
| --- | --- |
| LLM layer (get_anthropic_headers, optionally_handle_anthropic_oauth, passthrough) | #21039 (already merged) |
| Proxy layer (clean_headers, add_provider_specific_headers_to_request) | This PR |

Without this PR, #21039's fixes never take effect for proxy users because the Authorization header is stripped before reaching the Anthropic code.

Pre-Submission checklist

  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem
  • Added 10 new tests (5 for is_anthropic_oauth_key, 5 for proxy-layer forwarding)

Type

🐛 Bug Fix

Test plan

  • 24/24 tests pass in test_anthropic_common_utils.py
  • clean_headers preserves OAuth Authorization, strips non-OAuth
  • add_provider_specific_headers_to_request forwards OAuth only to Anthropic-compatible providers
  • is_anthropic_oauth_key handles raw tokens, Bearer format, None, empty string, regular keys, case sensitivity

@vercel

vercel bot commented Jan 28, 2026

The latest updates on your projects.

| Project | Deployment | Updated (UTC) |
| --- | --- | --- |
| litellm | Error | Feb 14, 2026 5:40am |


@CLAassistant

CLAassistant commented Jan 28, 2026

CLA assistant check
All committers have signed the CLA.

@iamadamreed
Contributor Author

Improved OAuth Token Security Fix

I've updated this PR to use an _oauth_ marker approach instead of setting OAuth tokens as x-api-key.

The Problem with the Original Approach

The original implementation extracted OAuth tokens from the Authorization header and set them as x-api-key:

```python
api_key = auth_header.replace("Bearer ", "")
headers["x-api-key"] = api_key
```

This causes two issues:

  1. Wrong Header Format: OAuth tokens from Claude Code Max belong in Authorization header, not x-api-key
  2. Token Leakage to Z.AI: Z.AI exposes an Anthropic-compatible API (api.zai.ai) that receives forwarded headers. Setting the OAuth token as x-api-key means it would be sent to Z.AI, which doesn't support Anthropic OAuth

The Solution

The update introduces an "_oauth_" marker pattern:

  1. optionally_handle_anthropic_oauth() returns api_key = "_oauth_" instead of the actual token
  2. transformation.py checks for this marker before setting x-api-key (and skips it if found)
  3. OAuth tokens stay in Authorization header only, and are only forwarded to Anthropic via the provider_specific_header mechanism
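A minimal sketch of the marker pattern described above. The `_oauth_` marker value and the `optionally_handle_anthropic_oauth` name come from the comment; the surrounding function shapes are simplified assumptions for illustration, not litellm's actual signatures.

```python
# Sketch of the "_oauth_" marker pattern; simplified, not litellm's code.

OAUTH_MARKER = "_oauth_"


def optionally_handle_anthropic_oauth(auth_header):
    """Step 1: return the marker instead of the raw OAuth token, so the
    token itself is never handed around as an api_key."""
    token = auth_header.replace("Bearer ", "")
    if token.startswith("sk-ant-oat"):
        return OAUTH_MARKER  # token stays in the Authorization header
    return token


def build_headers(headers, api_key):
    """Step 2 (transformation.py-style): skip x-api-key for the marker."""
    out = dict(headers)
    if api_key != OAUTH_MARKER:
        out["x-api-key"] = api_key
    return out
```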

Key Changes

  • common_utils.py: Returns "api_key": "_oauth_" as marker for OAuth
  • transformation.py: Checks for _oauth_ marker before setting x-api-key
  • Tests: Updated to verify x-api-key is NOT set for OAuth requests

Security Benefit

This ensures Claude Code Max OAuth tokens are never set as x-api-key, preventing them from being sent to Z.AI's Anthropic-compatible endpoint which would reject them (or potentially log/leak them).

The existing add_provider_specific_headers_to_request() mechanism already correctly forwards the Authorization header specifically to Anthropic-compatible providers.

@jquinter
Contributor

Thanks! Minor comments:

  1. The is_anthropic_oauth_key import is inside clean_headers() and add_provider_specific_headers_to_request(). This is fine for avoiding circular imports but could be moved to top-level if safe.
  2. Tests changed the model from claude-3-haiku-20240307 to anthropic/claude-3-haiku-20240307. Make sure this is intentional and compatible (claude-3-haiku doesn't normally use the anthropic/ prefix).
  3. Missing test for is_anthropic_oauth_key() - The helper function is tested indirectly but no dedicated unit tests for edge cases (empty string, None, regular API key, etc.)

I also noticed that your PR only updates the experimental pass-through endpoint. What about the other endpoints (batches, completion, count_tokens, files, skills)? Or are they covered by the router flow?

@iamadamreed
Contributor Author

@jquinter Thanks for the thorough review! Here are my responses to each point:

1. Import location (is_anthropic_oauth_key inside functions)

I investigated this and confirmed it's safe to move to top-level - common_utils.py has no imports from litellm.proxy, so no circular import risk. However, I'm keeping the lazy imports for consistency with existing LiteLLM patterns throughout the codebase. This is a stylistic choice rather than a technical constraint.

2. Model naming (claude-3-haiku-20240307 vs anthropic/claude-3-haiku-20240307)

You're absolutely right - this was an unnecessary change. I've reverted the model names back to claude-3-haiku-20240307 (without the anthropic/ prefix).

The original PR used the non-prefixed form, and since these tests call internal Anthropic functions (validate_environment(), validate_anthropic_messages_environment()) that don't process model names for routing, the prefix isn't needed. The anthropic/ prefix is only used by the LiteLLM router for provider dispatch.

3. Missing tests for is_anthropic_oauth_key()

Done! ✅ Added test_is_anthropic_oauth_key_edge_cases() in commit 009006a816 covering:

  • None and empty string
  • Regular API keys vs OAuth tokens
  • Bearer format
  • Case sensitivity
  • Edge cases (just prefix)

4. Other endpoints (batches, completion, count_tokens, files, skills)

This is a valid observation. The current PR only updates the experimental_pass_through/messages/ endpoint. The other endpoints you mentioned do not currently have OAuth support:

| Endpoint | OAuth Support |
| --- | --- |
| chat/ | ✅ Already has it (via inherited validate_environment()) |
| experimental_pass_through/messages/ | This PR |
| batches/ | ❌ Not supported |
| completion/ | ❌ Not supported |
| count_tokens/ | ❌ Not supported |
| files/ | ❌ Not supported |
| skills/ | ❌ Not supported |

That said, these endpoints aren't used by Claude Code Max, which is the primary use case for this OAuth fix. Claude Code Max exclusively uses the /v1/messages endpoint (experimental pass-through).

If OAuth support for these other endpoints becomes a requirement, that can be addressed in a follow-up PR using the same pattern (_oauth_ marker + conditional x-api-key setting).


Summary: Reverted the unnecessary model name prefix change. The tests and edge case tests are now complete. Other endpoints can get OAuth support when/if needed.

@kesensoy

Hey @iamadamreed — just wanted to check in on this PR. I have a use case for OAuth token forwarding and we are using your proposed changes as a fix.

It looks like the branch has merge conflicts and the CLA is still pending, which is blocking CI from running fully. Are you still planning to update this? Happy to help if needed!

@iamadamreed
Contributor Author

> Hey @iamadamreed — just wanted to check in on this PR. I have a use case for OAuth token forwarding and we are using your proposed changes as a fix.
>
> It looks like the branch has merge conflicts and the CLA is still pending, which is blocking CI from running fully. Are you still planning to update this? Happy to help if needed!

Yeah, I constantly forget to toggle between my personal and work account, and then do shit like this lol.

I am resolving this now.

…proxy layer

PR BerriAI#21039 fixed OAuth token handling at the LLM layer (Authorization: Bearer
instead of x-api-key), but the proxy layer still strips the Authorization
header in clean_headers() before it reaches the Anthropic code. This breaks
OAuth for proxy users (e.g., Claude Code Max through LiteLLM proxy).

Changes:
- Add is_anthropic_oauth_key() helper to detect OAuth tokens (sk-ant-oat*)
- Preserve OAuth Authorization headers in clean_headers() instead of stripping
- Forward OAuth Authorization via ProviderSpecificHeader in
  add_provider_specific_headers_to_request() so tokens only reach
  Anthropic-compatible providers (anthropic, bedrock, vertex_ai)

Fixes BerriAI#19618
@iamadamreed iamadamreed force-pushed the fix/anthropic-oauth-token-forwarding branch from 5c739b9 to 19ee11c Compare February 14, 2026 05:38
@iamadamreed iamadamreed changed the title fix(anthropic): properly forward OAuth tokens (sk-ant-oat*) to Anthropic-compatible providers fix(proxy): preserve and forward OAuth Authorization headers through proxy layer Feb 14, 2026
@iamadamreed
Contributor Author

PR Reworked — Rebased on main, scoped to proxy layer only

@kesensoy @jquinter This PR has been fully reworked. Here's what changed:

What happened

While this PR was waiting on review, PR #21039 landed on main and fixed OAuth token handling at the LLM layer: optionally_handle_anthropic_oauth(), get_anthropic_headers(), and the passthrough messages endpoint. That fix is more thorough than what this PR originally had (it handles OAuth tokens arriving via both the Authorization header and the api_key parameter).

However, #21039 did not fix the proxy layer. The proxy's clean_headers() still strips the Authorization header before it ever reaches the Anthropic code, which means OAuth is still broken for proxy users.

What this PR now does

Rebased on main (single clean commit, no conflicts), scoped to the gap #21039 left:

  1. is_anthropic_oauth_key() — new helper to detect sk-ant-oat* tokens
  2. clean_headers() — preserves Authorization header when it contains an Anthropic OAuth token (normally stripped)
  3. add_provider_specific_headers_to_request() — forwards OAuth Authorization via ProviderSpecificHeader, scoped to anthropic, bedrock, and vertex_ai only
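The scoping in step 3 can be sketched as follows. The dict shape used for ProviderSpecificHeader and the router-side merge are simplified assumptions for illustration; litellm's real types and merge logic differ in detail.

```python
# Simplified sketch of provider-scoped forwarding; litellm's actual
# ProviderSpecificHeader type and router merge logic differ in detail.

ANTHROPIC_COMPATIBLE_PROVIDERS = {"anthropic", "bedrock", "vertex_ai"}


def add_provider_specific_headers_to_request(data, headers):
    """Wrap a preserved OAuth Authorization header so it is scoped."""
    auth = headers.get("authorization", "")
    if auth.replace("Bearer ", "").startswith("sk-ant-oat"):
        data["provider_specific_header"] = {
            "custom_llm_provider": "anthropic",
            "extra_headers": {"authorization": auth},
        }
    return data


def headers_for_provider(data, provider):
    """Router side: only Anthropic-compatible providers see the header."""
    psh = data.get("provider_specific_header")
    if psh and provider in ANTHROPIC_COMPATIBLE_PROVIDERS:
        return dict(psh["extra_headers"])
    return {}
```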

@jquinter — re: your earlier review points

All 4 points are now moot since the _oauth_ marker approach was dropped entirely:

  1. ✅ Lazy imports kept for consistency (same pattern as before)
  2. ✅ Model name revert — no longer applicable (those test changes were dropped)
  3. ✅ is_anthropic_oauth_key() now has dedicated edge case tests
  4. ✅ Other endpoints — covered by fix(anthropic): use Authorization Bearer for OAuth tokens instead of x-api-key #21039 via the main validate_environment() path

CI note

The test (*) matrix jobs fail with "pyproject.toml changed significantly since poetry.lock was last generated" — this is a pre-existing issue that also affects main's HEAD. The unit-test check passes ✅.

@ygalvao

ygalvao commented Feb 16, 2026

This is needed, PR #21039 was indeed incomplete: it missed the proxy layer.

@jquinter
Contributor

@greptile-apps review this PR

@greptile-apps
Contributor

greptile-apps bot commented Feb 16, 2026

Greptile Summary

This PR fixes Anthropic OAuth token (sk-ant-oat*) forwarding through the LiteLLM proxy layer, complementing the LLM-layer fix in #21039. Previously, clean_headers() stripped the Authorization header before it could reach Anthropic-specific code, breaking OAuth for proxy users (e.g., Claude Code Max).

  • Adds is_anthropic_oauth_key() helper in litellm/llms/anthropic/common_utils.py to detect OAuth tokens in raw or Bearer-prefixed format
  • Modifies clean_headers() to preserve the Authorization header when it contains an Anthropic OAuth token
  • Modifies add_provider_specific_headers_to_request() to forward preserved OAuth tokens via ProviderSpecificHeader scoped only to anthropic, bedrock, and vertex_ai — preventing token leakage to other providers
  • The OAuth token does NOT leak through _get_forwardable_headers(), which only forwards x-* and anthropic-beta headers
  • Includes 10 new unit tests with no network calls
  • Note: This extends the existing pattern of Anthropic-specific code in the proxy layer (ANTHROPIC_API_HEADERS handling), which is flagged by the project's style guide recommending provider-specific code stay within llms/

Confidence Score: 4/5

  • This PR is safe to merge — the OAuth token is properly scoped to Anthropic-compatible providers and won't leak to other providers in the router.
  • Score of 4 reflects clean, focused changes with good test coverage. The token forwarding is properly scoped via ProviderSpecificHeader. Deducted 1 point for introducing additional provider-specific logic in the proxy layer (contrary to project style guide), though this extends an existing pattern. No security, logic, or runtime issues found.
  • litellm/proxy/litellm_pre_call_utils.py — contains the core behavioral change; worth verifying that the preserved authorization header doesn't interact unexpectedly with custom litellm_key_header_name configurations.

Important Files Changed

| Filename | Overview |
| --- | --- |
| litellm/llms/anthropic/common_utils.py | Adds is_anthropic_oauth_key() helper function to detect OAuth token prefix (sk-ant-oat*), handling both raw and Bearer-prefixed formats. Minor PEP 8 blank line issue. |
| litellm/proxy/litellm_pre_call_utils.py | Modifies clean_headers() to preserve Authorization when it contains an Anthropic OAuth token, and add_provider_specific_headers_to_request() to forward it via ProviderSpecificHeader scoped to Anthropic-compatible providers. Introduces provider-specific logic in the proxy layer. |
| tests/test_litellm/llms/anthropic/test_anthropic_common_utils.py | Adds 10 well-structured unit tests covering is_anthropic_oauth_key, clean_headers OAuth preservation, and add_provider_specific_headers_to_request forwarding. All tests are mocked with no network calls. |

Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant Proxy as LiteLLM Proxy
    participant Clean as clean_headers
    participant PSH as add_provider_specific_headers
    participant Main as litellm.completion
    participant API as Anthropic API

    Client->>Proxy: Request with OAuth Authorization header
    Proxy->>Proxy: user_api_key_auth authenticates request
    Proxy->>Clean: clean_headers(request.headers)
    Note over Clean: is_anthropic_oauth_key detects<br/>OAuth prefix, preserves header
    Clean-->>Proxy: headers with authorization kept
    Proxy->>PSH: add_provider_specific_headers(data, headers)
    Note over PSH: Wraps OAuth auth in<br/>ProviderSpecificHeader scoped to<br/>anthropic, bedrock, vertex_ai
    PSH-->>Proxy: data with provider_specific_header
    Proxy->>Main: litellm.completion(data)
    Note over Main: ProviderSpecificHeaderUtils<br/>matches provider, merges headers
    Main->>API: Forwards OAuth auth header
```

Last reviewed commit: 19ee11c


@greptile-apps greptile-apps bot left a comment


3 files reviewed, 2 comments


"""
Removes litellm api key from headers
"""
from litellm.llms.anthropic.common_utils import is_anthropic_oauth_key

Provider-specific import in proxy layer

This introduces Anthropic-specific logic (is_anthropic_oauth_key) into the proxy's clean_headers() function, which is a general-purpose proxy utility. The project's custom instructions recommend avoiding provider-specific code outside of the llms/ directory.

That said, I see that add_provider_specific_headers_to_request() already contains Anthropic-specific header handling (the ANTHROPIC_API_HEADERS loop), so this PR extends an existing pattern rather than introducing a new one. If the maintainers want to address this in the future, consider extracting a more generic "provider pass-through credential" abstraction that could work for any provider, rather than hardcoding Anthropic OAuth detection in the proxy layer.

Context Used: Rule from dashboard - What: Avoid writing provider-specific code outside of the llms/ directory.

Why: This practice ensur... (source)

Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!

Comment on lines +32 to 34

```python
return value.startswith(ANTHROPIC_OAUTH_TOKEN_PREFIX)

def optionally_handle_anthropic_oauth(
```

Missing blank line between functions

PEP 8 requires two blank lines between top-level function definitions. There's only one blank line separating is_anthropic_oauth_key from optionally_handle_anthropic_oauth.

Suggested change:

```python
return value.startswith(ANTHROPIC_OAUTH_TOKEN_PREFIX)


def optionally_handle_anthropic_oauth(
```

Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!

@ghost ghost changed the base branch from main to litellm_oss_staging_02_17_2026 February 17, 2026 04:08
@ghost ghost merged commit 8d50956 into BerriAI:litellm_oss_staging_02_17_2026 Feb 17, 2026
4 of 17 checks passed
zepfu added a commit to zepfu/litellm that referenced this pull request Mar 6, 2026
- Remove aawm.1 entry (absorbed by upstream PR BerriAI#19912 in v1.81.13)
- Update base version references from v1.81.13 to upstream/main (v1.81.14)
- Add aawm.3 (header propagation) and aawm.4 (agent identity) entries
- Add aawm.5 (Gemini OAuth) entry with updated description reflecting
  that only ya29.* is new; Anthropic OAuth now handled upstream
- Add "Dropped Patches" section documenting aawm.1 absorption
- Update patch count to 4 active patches (aawm.2, aawm.3, aawm.4, aawm.5)
- Update Dockerfile overlay example to include all 6 patched files
- Update branch strategy table to include aawm/patches-rebase
zepfu added a commit to zepfu/litellm that referenced this pull request Mar 21, 2026
- Remove aawm.1 entry (absorbed by upstream PR BerriAI#19912 in v1.81.13)
- Update base version references from v1.81.13 to upstream/main (v1.81.14)
- Add aawm.3 (header propagation) and aawm.4 (agent identity) entries
- Add aawm.5 (Gemini OAuth) entry with updated description reflecting
  that only ya29.* is new; Anthropic OAuth now handled upstream
- Add "Dropped Patches" section documenting aawm.1 absorption
- Update patch count to 4 active patches (aawm.2, aawm.3, aawm.4, aawm.5)
- Update Dockerfile overlay example to include all 6 patched files
- Update branch strategy table to include aawm/patches-rebase


Successfully merging this pull request may close these issues.

[Bug] OAuth tokens not forwarded to Anthropic - PR #19453 incomplete

5 participants