feat: Enable proxy to llama.cpp for Anthropic Messages API support. #7395

Merged
louis-jan merged 1 commit into main from feat/enable-claude-messages-endpoint on Jan 24, 2026
Conversation

@louis-jan (Contributor) commented Jan 24, 2026

Describe Your Changes

This PR updates the Local API Server proxy to enable llama.cpp's Anthropic Messages API support, made possible by llama.cpp PR ggml-org/llama.cpp#17570.

Anthropic Messages API Reference: https://platform.claude.com/docs/en/api/messages

This also includes a small bug fix: the completion endpoint was previously not fully OpenAI-compatible, since requests were forwarded to /completions instead of /v1/completions.

Note: the Anthropic API key header is not supported yet in this PR, so a minor tweak is needed to pass the bearer token through the environment settings (anthropics/claude-code#1859).
Example env:

export ANTHROPIC_BASE_URL=http://127.0.0.1:1337
export ANTHROPIC_AUTH_TOKEN="Authorization: Bearer 1234"
# This header will be skipped in the next update, since clients should send the x-api-key header instead
export ANTHROPIC_CUSTOM_HEADERS="Authorization: Bearer 1234"
export ANTHROPIC_DEFAULT_SONNET_MODEL=janhq/Jan-v3-4b-base-instruct-Q4_K_XL
export ANTHROPIC_DEFAULT_OPUS_MODEL=janhq/Jan-v3-4b-base-instruct-Q4_K_XL
export ANTHROPIC_DEFAULT_HAIKU_MODEL=janhq/Jan-v3-4b-base-instruct-Q4_K_XL
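Until native x-api-key support lands, the workaround above passes the bearer token as a custom header. As a minimal sketch (illustrative only, not this PR's implementation), a proxy could accept either auth style with a helper like the hypothetical `extract_token` below:

```rust
// Illustrative sketch: accept either Anthropic's `x-api-key` header or the
// Bearer-token workaround shown above. `extract_token` is a hypothetical
// helper, not code from this PR.
fn extract_token(headers: &[(&str, &str)]) -> Option<String> {
    for (name, value) in headers {
        match name.to_ascii_lowercase().as_str() {
            // OpenAI-style auth, as required by the current workaround.
            "authorization" => {
                if let Some(token) = value.strip_prefix("Bearer ") {
                    return Some(token.to_string());
                }
            }
            // Native Anthropic auth, planned for a follow-up update.
            "x-api-key" => return Some(value.to_string()),
            _ => {}
        }
    }
    None
}

fn main() {
    let bearer = [("Authorization", "Bearer 1234")];
    let api_key = [("x-api-key", "1234")];
    assert_eq!(extract_token(&bearer).as_deref(), Some("1234"));
    assert_eq!(extract_token(&api_key).as_deref(), Some("1234"));
    println!("token extraction sketch ok");
}
```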

Screenshots

(Two screenshots attached: 2026-01-24 at 21:15:55 and 21:10:49.)

Action Items:

  • Update docs

Fixes Issues

  • Closes #
  • Closes #

Self Checklist

  • Added relevant comments, esp in complex areas
  • Updated docs (for bug fixes / features)
  • Created issues for follow-up changes or refactoring needed

Copilot AI review requested due to automatic review settings, January 24, 2026 14:19

Copilot AI (Contributor) left a comment
Pull request overview

This PR extends the local API server proxy so it can forward Anthropic-compatible /v1/messages and /v1/messages/count_tokens calls to llama.cpp, and adds some initial tests around the proxy configuration and path handling.

Changes:

  • Exposes ProxyConfig and get_destination_path from proxy.rs and adjusts the upstream URL construction to always call the llama.cpp server under /v1{destination_path}.
  • Extends the dynamic routing in proxy_request to treat /messages and /messages/count_tokens like the existing /chat/completions, /completions, and /embeddings endpoints, including model lookup in the request body.
  • Introduces a new src-tauri/src/core/server/tests.rs test module and wires it into server/mod.rs to cover get_destination_path, basic ProxyConfig construction, and some whitelisting/method/header expectations.
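The routing change above can be sketched roughly as follows. This is a simplified illustration, not the actual `proxy.rs` code; the real `get_destination_path` signature and whitelist may differ, but the idea is that the upstream llama.cpp server is always called under a hard-coded /v1 prefix, with /messages and /messages/count_tokens routed like the existing OpenAI-compatible endpoints:

```rust
// Simplified sketch of the routing described above (illustrative only).
fn get_destination_path(path: &str) -> Option<String> {
    const SUPPORTED: [&str; 5] = [
        "/chat/completions",
        "/completions",
        "/embeddings",
        "/messages",
        "/messages/count_tokens",
    ];
    // Tolerate an optional /v1 prefix on the incoming request path.
    let trimmed = path.strip_prefix("/v1").unwrap_or(path);
    if SUPPORTED.contains(&trimmed) {
        // Upstream llama.cpp is always addressed under /v1.
        Some(format!("/v1{trimmed}"))
    } else {
        None
    }
}

fn main() {
    assert_eq!(get_destination_path("/messages").as_deref(), Some("/v1/messages"));
    assert_eq!(get_destination_path("/completions").as_deref(), Some("/v1/completions"));
    assert_eq!(get_destination_path("/unknown"), None);
    println!("routing sketch ok");
}
```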

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 6 comments.

Reviewed files:

  • src-tauri/src/core/server/proxy.rs — Makes ProxyConfig and get_destination_path public, adds /messages and /messages/count_tokens handling to the routing match, bumps some log levels, and changes the upstream URL to include a hard-coded /v1 prefix.
  • src-tauri/src/core/server/tests.rs — New test module with unit tests for get_destination_path, ProxyConfig field values, and some locally defined whitelists/method/header lists intended to mirror proxy behavior.
  • src-tauri/src/core/server/mod.rs — Exposes the new server tests module under #[cfg(test)] so the new tests are compiled and run.


@github-actions (Contributor) commented:

Barecheck - Code coverage report

Total: 26.23%

Your code coverage diff: 0.00% ▴

✅ All code changes are covered

@louis-jan changed the title from "feat: enable proxy to llama.cpp Anthropic Messages API support" to "feat: Enable proxy to llama.cpp for Anthropic Messages API support." on Jan 24, 2026
@louis-jan louis-jan merged commit 09df943 into main Jan 24, 2026
24 checks passed
@louis-jan louis-jan deleted the feat/enable-claude-messages-endpoint branch January 24, 2026 17:12
@github-project-automation github-project-automation bot moved this to QA in Jan Jan 24, 2026

Labels: none. Project status: QA. 3 participants.