Fix: remove unsupported prompt-caching-scope-2026-01-05 header for vertex ai#20058

Merged
Sameerlite merged 1 commit into main from litellm_vertex_ai_prompt-caching-scope-2026-01-05 on Jan 30, 2026
Conversation

@Sameerlite (Collaborator)

Relevant issues

Fixes #19984

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added tests in the tests/litellm/ directory; adding at least 1 test is a hard requirement (see details).
  • My PR passes all unit tests via make test-unit.
  • My PR's scope is as isolated as possible; it solves only 1 specific problem.

CI (LiteLLM team)

CI status guideline:

  • 50-55 passing tests: main is stable, with minor issues.
  • 45-49 passing tests: acceptable, but needs attention.
  • <= 40 passing tests: unstable; be careful with your merges and assess the risk.
  • Branch creation CI run
    Link:

  • CI run for the last commit
    Link:

  • Merge / cherry-pick CI run
    Links:

Type

🆕 New Feature
🐛 Bug Fix
🧹 Refactoring
📖 Documentation
🚄 Infrastructure
✅ Test

Changes

[Two screenshots illustrating the change are attached in the original PR.]
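The change removes the prompt-caching-scope-2026-01-05 value from the anthropic-beta header before requests are forwarded to Vertex AI, which rejects it. A minimal sketch of that idea is below; this is not LiteLLM's actual implementation, and the constant and function names (UNSUPPORTED_VERTEX_BETAS, filter_anthropic_beta_header) are hypothetical:

```python
# Hypothetical sketch: strip anthropic-beta values that the Vertex AI
# Anthropic endpoint does not accept before forwarding the request.
# Names below are illustrative, not LiteLLM's real identifiers.

UNSUPPORTED_VERTEX_BETAS = {"prompt-caching-scope-2026-01-05"}  # assumed set


def filter_anthropic_beta_header(headers: dict) -> dict:
    """Return a copy of headers with unsupported anthropic-beta values removed.

    The anthropic-beta header may carry a comma-separated list of beta
    features; only the unsupported entries are dropped, and the header is
    removed entirely if no supported value remains.
    """
    filtered = dict(headers)
    beta = filtered.get("anthropic-beta")
    if beta is None:
        return filtered
    kept = [
        value.strip()
        for value in beta.split(",")
        if value.strip() not in UNSUPPORTED_VERTEX_BETAS
    ]
    if kept:
        filtered["anthropic-beta"] = ",".join(kept)
    else:
        filtered.pop("anthropic-beta")
    return filtered
```

With this approach, a header like "prompt-caching-2024-07-31,prompt-caching-scope-2026-01-05" would be forwarded as "prompt-caching-2024-07-31", while a header containing only the unsupported value would be dropped altogether.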

@vercel

vercel bot commented Jan 30, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: litellm
Deployment status: Error
Updated (UTC): Jan 30, 2026, 0:06am

@Sameerlite Sameerlite merged commit 6e92103 into main Jan 30, 2026
25 of 65 checks passed

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Bug]: VertexAI Anthropic passthrough fails with prompt-caching-scope-2026-01-05 beta header

1 participant