
feat: add /completion endpoint#275

Merged
mostlygeek merged 3 commits into mostlygeek:main from Yandrik:completion-endpoint
Aug 29, 2025
Conversation

@Yandrik (Contributor) commented Aug 28, 2025

This pull request adds support for the llama-server /completion endpoint, enabling e.g. multimodal transcription via Voxtral or other custom chat-template use cases.

Updates

llama-server /completion endpoint:

  • Updated README.md to document the /completion endpoint as a supported llama-server API.
  • Added a /completion POST handler to misc/simple-responder/simple-responder.go for compatibility with llama-server, returning a sample response.

Proxy manager integration:

  • Registered the /completion endpoint in proxy/proxymanager.go

Testing:

  • Added a test in proxy/proxymanager_test.go to verify that the /completion endpoint correctly proxies requests and returns expected results.

Summary by CodeRabbit

  • New Features

    • Added POST /completion endpoint compatible with llama-server, available alongside existing completion routes. Returns response text and usage metrics (prompt, completion, total tokens).
  • Documentation

    • README updated to list the /completion endpoint under supported endpoints.
  • Tests

    • Added tests to verify /completion requests are proxied and respond as expected.

@coderabbitai bot commented Aug 28, 2025

Walkthrough

Adds a POST /completion endpoint to the proxy and simple responder, updates README to document the endpoint, and adds a test verifying the proxy forwards /completion requests and returns expected responses.

Changes

  • Docs (README.md): Documented /completion as a supported llama-server endpoint.
  • Simple Responder (misc/simple-responder/simple-responder.go): Added a POST /completion handler returning JSON with responseMessage and usage (completion_tokens, prompt_tokens, total_tokens).
  • Proxy Routes & Tests (proxy/proxymanager.go, proxy/proxymanager_test.go): Registered POST /completion in Gin with the existing metrics middleware and proxyOAIHandler; added TestProxyManager_CompletionEndpoint to assert proxying and response content.
  • Tests, formatting only (event/default_test.go, event/event_test.go): Whitespace/formatting adjustments only; no logic changes.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant C as Client
  participant P as ProxyManager (Gin)
  participant H as proxyOAIHandler
  participant R as Responder/Upstream

  Note over C,P: llama-server compatible route
  C->>P: POST /completion { model, ... }
  P->>H: metrics middleware → proxyOAIHandler
  H->>R: Forward completion request
  R-->>H: JSON { response, usage, ... }
  H-->>P: Response passthrough
  P-->>C: 200 OK + JSON

  rect rgb(230,245,255)
  Note over P,H: New route wired to existing OAI proxy flow
  end
sequenceDiagram
  autonumber
  participant C as Client
  participant S as Simple Responder

  C->>S: POST /completion
  S-->>C: 200 OK { responseMessage, usage }

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Suggested reviewers

  • mostlygeek


📜 Recent review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between dc7461b and a3855fc.

📒 Files selected for processing (3)
  • event/default_test.go (1 hunks)
  • event/event_test.go (1 hunks)
  • proxy/proxymanager_test.go (1 hunks)
✅ Files skipped from review due to trivial changes (2)
  • event/event_test.go
  • event/default_test.go
🚧 Files skipped from review as they are similar to previous changes (1)
  • proxy/proxymanager_test.go

@coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
README.md (1)

26-26: Clarify model requirement for llama-swap’s /completion.

Unlike llama.cpp, llama-swap requires a "model" field in the request body to route the call. Document this to prevent 400s from clients that omit it.

-  - `/completion` - for completion endpoint
+  - `/completion` - for completion endpoint (requires {"model": "<model_id>"} in the JSON body when used via llama-swap)
misc/simple-responder/simple-responder.go (1)

156-168: Add streaming support to /completion for parity with llama.cpp behavior.

Optional but useful for clients sending ?stream=true; mirrors your existing chat streaming path.

-// llama-server compatibility: /completion
-r.POST("/completion", func(c *gin.Context) {
-    c.Header("Content-Type", "application/json")
-    c.JSON(http.StatusOK, gin.H{
-        "responseMessage": *responseMessage,
-        "usage": gin.H{
-            "completion_tokens": 10,
-            "prompt_tokens":     25,
-            "total_tokens":      35,
-        },
-    })
-})
+// llama-server compatibility: /completion
+r.POST("/completion", func(c *gin.Context) {
+    // Support optional streaming like /v1/chat/completions
+    if c.Query("stream") == "true" {
+        c.Header("Content-Type", "text/event-stream")
+        c.Header("Cache-Control", "no-cache")
+        c.Header("Connection", "keep-alive")
+        c.Header("Transfer-Encoding", "chunked")
+
+        // optional wait to simulate slower responses
+        if wait, err := time.ParseDuration(c.Query("wait")); err == nil {
+            time.Sleep(wait)
+        }
+        for i := 0; i < 10; i++ {
+            c.SSEvent("message", gin.H{
+                "created": time.Now().Unix(),
+                "choices": []gin.H{{"index": 0, "text": "asdf", "finish_reason": nil}},
+            })
+            c.Writer.Flush()
+        }
+        c.SSEvent("message", gin.H{
+            "usage": gin.H{"completion_tokens": 10, "prompt_tokens": 25, "total_tokens": 35},
+            "timings": gin.H{"prompt_n": 25, "prompt_ms": 13, "predicted_n": 10, "predicted_ms": 17, "predicted_per_second": 10},
+        })
+        c.Writer.Flush()
+        c.SSEvent("message", "[DONE]")
+        c.Writer.Flush()
+        return
+    }
+
+    c.Header("Content-Type", "application/json")
+    c.JSON(http.StatusOK, gin.H{
+        "responseMessage": *responseMessage,
+        "usage": gin.H{
+            "completion_tokens": 10,
+            "prompt_tokens":     25,
+            "total_tokens":      35,
+        },
+    })
+})
📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 57803fd and dc7461b.

📒 Files selected for processing (4)
  • README.md (1 hunks)
  • misc/simple-responder/simple-responder.go (1 hunks)
  • proxy/proxymanager.go (1 hunks)
  • proxy/proxymanager_test.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
proxy/proxymanager_test.go (3)
proxy/config.go (3)
  • AddDefaultGroupToConfig (364-406)
  • Config (149-169)
  • ModelConfig (20-40)
proxy/proxymanager.go (1)
  • New (47-130)
proxy/process.go (1)
  • StopWaitForInflightRequest (36-36)
🪛 GitHub Actions: Linux CI
proxy/proxymanager_test.go

[error] 1-1: gofmt formatting check failed for proxy/proxymanager_test.go. Command that triggered the failure: gofmt -l . | grep -v 'event/.*_test.go' | wc -l. Run 'gofmt -w .' to fix formatting.

🔇 Additional comments (1)
proxy/proxymanager.go (1)

206-208: Route registration looks correct and consistent.

Good: uses MetricsMiddleware and the same proxy handler as other llama-server endpoints, ensuring filtering and model rewriting apply uniformly.

@mostlygeek (Owner) left a comment

Hi,

Please revert the changes to event/default_test.go and event/event_test.go.

Everything else looks good.

@Yandrik (Contributor, Author) commented Aug 28, 2025

Ah, sorry, just saw the formatting made that very messy - apologies.
Files reverted :)

@mostlygeek mostlygeek merged commit 977f185 into mostlygeek:main Aug 29, 2025
3 checks passed
