Walkthrough
Adds a new PeerConfig type and a Peers map field to Config for YAML-based peer definitions; updates the POSIX config test and fixtures to include and validate a desktop peer entry.
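A minimal sketch of what the new peers mapping might look like in YAML. The field names below are assumptions for illustration; only the API key value and the desktop peer ID appear in this review:

```yaml
# hypothetical llama-swap config excerpt: peers keyed by peer ID
peers:
  desktop:
    address: "http://192.168.1.50:8080"  # assumed field name
    apiKey: "secret-key"                 # matches the test fixture value
```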
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks: ❌ 1 warning, ✅ 2 passed
Actionable comments posted: 2
🧹 Nitpick comments (2)
proxy/config.go (1)
165-165: Make the peers field omit-empty on marshal. This keeps serialized configs tidy when no peers are set.
```diff
- Peers map[string]PeerConfig `yaml:"peers"` /* key is peer ID */
+ Peers map[string]PeerConfig `yaml:"peers,omitempty"` /* key is peer ID */
```

proxy/config_posix_test.go (1)
242-250: Align the expected struct with the APIKey rename (if adopted). Update the field name in the expected value.
```diff
- ApiKey: "secret-key",
+ APIKey: "secret-key",
```

Optionally, add a redaction assertion once the Stringer is in place:
```go
t.Run("peers redact API keys in fmt output", func(t *testing.T) {
	s := fmt.Sprintf("%v", config.Peers["desktop"])
	assert.NotContains(t, s, "secret-key")
	assert.Contains(t, s, "REDACTED")
})
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
proxy/config.go (2 hunks)
proxy/config_posix_test.go (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
proxy/config_posix_test.go (1)
proxy/config.go (1)
PeerConfig(149-155)
🔇 Additional comments (1)
proxy/config_posix_test.go (1)
151-157: LGTM: Fixture exercises the new peers mapping.
The configurations are getting a bit too complicated and have outgrown the proxy package. Moving them to their own package.
Closing this PR as I don't have time to work on it for now. The config changes were cherry-picked into #329.
This PR allows a single llama-swap to be the central proxy for models served by other inference servers. The peer servers can be another llama-swap or any API that supports the /v1/* inference endpoints.

Updates: mostlygeek#433, mostlygeek#299
Closes: mostlygeek#296
Summary by CodeRabbit
New Features
- YAML-based peer definitions via a new PeerConfig type and a Peers map on Config.
Tests
- POSIX config test and fixtures now include and validate a desktop peer entry.