docs: claude prompt caching note #6429
Conversation
| [xAI](https://x.ai/) | Access to xAI's Grok models, including grok-3, grok-3-mini, and grok-3-fast, with a 131,072-token context window. | `XAI_API_KEY`, `XAI_HOST` (optional) |

:::tip Prompt Caching for Claude Models
goose automatically enables Anthropic's [prompt caching](https://platform.claude.com/docs/en/build-with-claude/prompt-caching) when using Claude models via the Anthropic, Databricks, OpenRouter, and LiteLLM providers. This adds `cache_control` markers to requests, which can reduce costs in longer conversations by caching frequently used context. See the [provider implementations](https://github.com/block/goose/tree/main/crates/goose/src/providers) for technical details.
:::
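The tip above mentions `cache_control` markers. As a rough illustrative sketch (not goose's actual Rust implementation; the model name and helper function are assumptions), this is the shape of an Anthropic Messages API request body with a caching breakpoint placed on a large, reused system prompt:

```python
# Hypothetical sketch of the request body a prompt-caching provider might
# construct. Field names follow Anthropic's Messages API, where a
# `cache_control` marker on a content block asks the API to cache the
# prompt prefix up to and including that block.

def build_cached_request(system_prompt: str, user_message: str) -> dict:
    """Build a Messages API payload with a cache_control breakpoint
    on the (typically large, frequently reused) system prompt."""
    return {
        "model": "claude-sonnet-4-5",  # assumed model name, for illustration
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_prompt,
                # Marks the end of the cacheable prefix; subsequent
                # requests sharing this prefix can hit the cache.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }

request = build_cached_request("You are goose, a helpful agent.", "Hello")
print(request["system"][0]["cache_control"])  # {'type': 'ephemeral'}
```

Later turns in a long conversation then reuse the cached prefix instead of reprocessing it, which is where the cost reduction comes from.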
Linked to the providers directory to help future-proof the list of providers.
Pull request overview
This PR documents prompt caching support for Claude models across multiple providers, improving user awareness of this cost-saving feature.
- Adds a tip section explaining that goose automatically enables Anthropic's prompt caching for Claude models
- Clarifies which providers support this feature (Anthropic, Databricks, OpenRouter, and LiteLLM)
- Provides links to Anthropic's documentation and the implementation code
Summary
This PR documents prompt caching, which is currently supported by the Anthropic, Databricks, OpenRouter, and LiteLLM providers for Claude models.
Documentation updates:
- documentation/docs/getting-started/providers.md

Type of Change
AI Assistance
Testing
none