[Bug]: Anthropic /messages/count_tokens endpoint does not respect customized api_base URL #15473

@DmitriyAlergant

Description

What happened?

proxy/utils.py line 3745:

        # Get Anthropic API key from deployment config
        anthropic_api_key = None
        if deployment is not None:
            anthropic_api_key = deployment.get("litellm_params", {}).get("api_key")

Only the API key is read from the deployment config; a customized api_base is ignored, even though it may point to another chained LiteLLM instance or any other kind of proxy.
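A minimal sketch of the intended behavior (function and constant names here are hypothetical, not the actual LiteLLM code): read api_base alongside api_key from the deployment's litellm_params, falling back to the Anthropic default only when none is configured.

```python
# Hypothetical fix sketch: resolve both credentials from litellm_params.
# DEFAULT_ANTHROPIC_API_BASE is an assumed fallback, not a LiteLLM constant.
DEFAULT_ANTHROPIC_API_BASE = "https://api.anthropic.com"


def get_anthropic_credentials(deployment):
    """Return (api_key, api_base) for a deployment config dict."""
    api_key = None
    api_base = DEFAULT_ANTHROPIC_API_BASE
    if deployment is not None:
        litellm_params = deployment.get("litellm_params", {})
        api_key = litellm_params.get("api_key")
        # Respect a customized api_base (e.g. another chained LiteLLM proxy)
        api_base = litellm_params.get("api_base") or DEFAULT_ANTHROPIC_API_BASE
    return api_key, api_base
```

With this, a deployment whose litellm_params sets api_base to another proxy would have its count_tokens call routed there instead of api.anthropic.com.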

I have started a Codex Web task; if it succeeds (with a test), a PR will follow.

Relevant log output

llmops_litellm  | 03:34:15 - LiteLLM Proxy:WARNING: utils.py:3778 - Error calling Anthropic API: Error code: 403 - {'error': {'type': 'forbidden', 'message': 'Request not allowed'}}, falling back to LiteLLM tokenizer

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.77.5-stable

Twitter / LinkedIn details

No response
