@konekohana konekohana (Contributor) commented Oct 14, 2025

Prompt caching for Anthropic models with OpenRouter

OpenRouter supports prompt caching for Anthropic models, as described in OpenRouter's documentation. This PR adds `supports_prompt_caching` and cache write/read prices for the relevant models to both model price map files.

Since this PR contains no code changes, no tests were added.
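As a sketch of what these fields enable, the snippet below models a hypothetical price-map entry and computes the input-side cost of a cached request. The model name, per-token prices, and the `prompt_cost` helper are illustrative only, not code or values from this PR (which changes only the JSON price maps); the field names follow LiteLLM's model price map conventions.

```python
# Hypothetical price-map entry for an OpenRouter Anthropic model.
# Field names follow LiteLLM's price map conventions; the model name
# and per-token prices here are illustrative, not the PR's actual values.
PRICE_MAP = {
    "openrouter/anthropic/claude-3.5-sonnet": {
        "input_cost_per_token": 3e-06,
        "output_cost_per_token": 1.5e-05,
        "cache_creation_input_token_cost": 3.75e-06,  # price per cache-write token
        "cache_read_input_token_cost": 3e-07,         # price per cache-read token
        "supports_prompt_caching": True,
        "litellm_provider": "openrouter",
        "mode": "chat",
    }
}


def prompt_cost(model: str, input_tokens: int,
                cache_write_tokens: int, cache_read_tokens: int) -> float:
    """Input-side cost of one request, splitting tokens into fresh input,
    cache writes, and cache reads, each billed at its own rate."""
    p = PRICE_MAP[model]
    return (
        input_tokens * p["input_cost_per_token"]
        + cache_write_tokens * p["cache_creation_input_token_cost"]
        + cache_read_tokens * p["cache_read_input_token_cost"]
    )


# Re-reading a 2,000-token cached prefix is ~10x cheaper than sending it fresh.
model = "openrouter/anthropic/claude-3.5-sonnet"
print(prompt_cost(model, input_tokens=500, cache_write_tokens=0, cache_read_tokens=2000))
```

With these fields present, a cost calculator can bill cache reads at the discounted rate and cache writes at the surcharged rate instead of treating all prompt tokens as ordinary input.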

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement; see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests via `make test-unit`
  • My PR's scope is as isolated as possible; it only solves 1 specific problem

Type

🆕 New Feature

Changes


vercel bot commented Oct 14, 2025

@konekohana is attempting to deploy a commit to the CLERKIEAI Team on Vercel.

A member of the Team first needs to authorize it.

@krrishdholakia krrishdholakia merged commit 71b9bec into BerriAI:main Oct 16, 2025
4 of 6 checks passed