What happened?
proxy/utils.py line 3745:
# Get Anthropic API key from deployment config
anthropic_api_key = None
if deployment is not None:
    anthropic_api_key = deployment.get("litellm_params", {}).get("api_key")
Only the API key is read from the deployment config; a customized api base is ignored. That api base could very well point to another chained LiteLLM instance, or any other kind of proxy, so requests end up going to the default Anthropic endpoint with a key that is not valid there.
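A minimal sketch of the fix, assuming the deployment dict shape shown in the snippet above. The helper name is hypothetical; `api_base` follows the usual `litellm_params` key naming, and the returned value would then need to be passed through to the Anthropic client (e.g. as its `base_url`) instead of being dropped:

```python
def get_anthropic_client_params(deployment):
    """Return (api_key, api_base) from a deployment config, or (None, None).

    Hypothetical helper: mirrors the existing api_key lookup in
    proxy/utils.py, but also reads api_base so a custom endpoint
    (e.g. a chained LiteLLM proxy) is honored.
    """
    anthropic_api_key = None
    anthropic_api_base = None
    if deployment is not None:
        litellm_params = deployment.get("litellm_params", {})
        anthropic_api_key = litellm_params.get("api_key")
        # Previously ignored: a customized api_base.
        anthropic_api_base = litellm_params.get("api_base")
    return anthropic_api_key, anthropic_api_base
```

With both values available, the token-counting call could target the configured proxy rather than api.anthropic.com, avoiding the 403 below.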
I have started a Codex Web task; if it succeeds (with a test), a PR will follow.
Relevant log output
llmops_litellm | 03:34:15 - LiteLLM Proxy:WARNING: utils.py:3778 - Error calling Anthropic API: Error code: 403 - {'error': {'type': 'forbidden', 'message': 'Request not allowed'}}, falling back to LiteLLM tokenizer
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.77.5-stable
Twitter / LinkedIn details
No response