feat: Enable proxy to llama.cpp for Anthropic Messages API support.#7395
Merged
Conversation
Contributor
Pull request overview
This PR extends the local API server proxy so it can forward Anthropic-compatible /v1/messages and /v1/messages/count_tokens calls to llama.cpp, and adds some initial tests around the proxy configuration and path handling.
Changes:
- Exposes `ProxyConfig` and `get_destination_path` from `proxy.rs` and adjusts the upstream URL construction to always call the llama.cpp server under `/v1{destination_path}`.
- Extends the dynamic routing in `proxy_request` to treat `/messages` and `/messages/count_tokens` like the existing `/chat/completions`, `/completions`, and `/embeddings` endpoints, including model lookup in the request body.
- Introduces a new `src-tauri/src/core/server/tests.rs` test module and wires it into `server/mod.rs` to cover `get_destination_path`, basic `ProxyConfig` construction, and some whitelisting/method/header expectations.
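Based on the description above, the routing and URL construction might look roughly like this. This is a hedged sketch, not the PR's actual code: the function name and endpoint list come from the review summary, but the signature, match arms, and port are illustrative assumptions.

```rust
/// Map an incoming local-server path to a destination path on the
/// llama.cpp server. Sketch only; mirrors the endpoints named in the PR
/// description, including the new Anthropic-compatible routes.
fn get_destination_path(path: &str) -> Option<&'static str> {
    match path {
        // Existing OpenAI-compatible endpoints:
        "/chat/completions" => Some("/chat/completions"),
        "/completions" => Some("/completions"),
        "/embeddings" => Some("/embeddings"),
        // New Anthropic Messages API endpoints added by this PR:
        "/messages" => Some("/messages"),
        "/messages/count_tokens" => Some("/messages/count_tokens"),
        _ => None,
    }
}

fn main() {
    // Per the PR, the upstream URL is always built under a hard-coded
    // /v1 prefix (the host and port here are placeholders).
    if let Some(dest) = get_destination_path("/messages") {
        let upstream = format!("http://127.0.0.1:8080/v1{}", dest);
        println!("{}", upstream);
    }
}
```

Routing unknown paths to `None` lets the proxy reject requests that are not on the whitelist instead of blindly forwarding them.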
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| `src-tauri/src/core/server/proxy.rs` | Makes `ProxyConfig` and `get_destination_path` public, adds `/messages` and `/messages/count_tokens` handling to the routing match, bumps some log levels, and changes the upstream URL to include a hard-coded `/v1` prefix. |
| `src-tauri/src/core/server/tests.rs` | New test module with unit tests for `get_destination_path`, `ProxyConfig` field values, and some locally defined whitelist/method/header lists intended to mirror proxy behavior. |
| `src-tauri/src/core/server/mod.rs` | Exposes the new server `tests` module under `#[cfg(test)]` so the new tests are compiled and run. |
Contributor
Barecheck code coverage report — Total: 26.23%. Code coverage diff: 0.00% ▴ ✅ All code changes are covered.
Minh141120
approved these changes
Jan 24, 2026
Describe Your Changes
This PR updates the Local API Server proxy to support the Anthropic Messages API via llama.cpp, which added this capability in ggml-org/llama.cpp#17570.
Anthropic Messages API Reference: https://platform.claude.com/docs/en/api/messages
Small bug fix: previously the proxy was not fully OpenAI-compatible for the completion endpoint, since it forwarded requests to /completions instead of /v1/completions.
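For illustration, an Anthropic-style request through the local proxy might look like the following. This is a sketch only: the port, API key variable, and model name are placeholders, not values taken from this PR.

```shell
# Anthropic-compatible Messages call, forwarded by the proxy to
# llama.cpp's /v1/messages endpoint (port and model are examples).
curl http://127.0.0.1:1337/v1/messages \
  -H "Authorization: Bearer $LOCAL_API_KEY" \
  -H "content-type: application/json" \
  -d '{
    "model": "llama3.2:3b",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```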
Note: the Anthropic API key header is not yet supported through this PR, so a minor tweak is needed to pass the bearer token via an environment setting (anthropics/claude-code#1859).
Example env.
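A minimal sketch of such an environment setting, assuming a client like Claude Code that reads `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN`; the port and token value are placeholders:

```shell
# Point Anthropic-compatible clients at the local API server.
# The bearer token is passed via env because the Anthropic API key
# header is not yet supported (see anthropics/claude-code#1859).
export ANTHROPIC_BASE_URL="http://127.0.0.1:1337"
export ANTHROPIC_AUTH_TOKEN="<your-local-api-key>"
```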
Screenshots
Action Items:
Fixes Issues
Self Checklist