Add local-inference feature flag to gate llama-cpp-2 dependency #7648
r0x0d wants to merge 2 commits
Conversation
This is the first of a series of changes adding feature flags to goose so that certain aspects can be disabled at build time. We discussed this in the past on Discord, so I will link it in the description.
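The pattern under discussion can be sketched as a Cargo.toml excerpt. This is a hypothetical layout, not taken from the PR diff; the version number and exact crate path are assumptions:

```toml
# Hypothetical crates/goose/Cargo.toml excerpt: llama-cpp-2 only builds
# when the feature is enabled, and the feature is on by default, so
# existing `cargo build` invocations are unaffected.
[features]
default = ["local-inference"]
local-inference = ["dep:llama-cpp-2"]

[dependencies]
llama-cpp-2 = { version = "0.1", optional = true }
```

Downstream builds then opt out with `cargo build --no-default-features`, skipping the native dependency and its libclang requirement.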
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 5279e0d3a6
ℹ️ About Codex in GitHub
Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".
Force-pushed from 5279e0d to f6bb6b8
💡 Codex Review
Reviewed commit: f6bb6b81b4
|
Thanks for this. Please make sure that all the relevant crates (goose-server, the ACP server, the CLI) have a corresponding feature flag. Also, the frontend will need to handle 404 on the local inference endpoints and hide/disable that provider and the local inference tab in config.
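Forwarding the flag from each downstream crate might look like this hypothetical goose-cli/Cargo.toml excerpt (actual crate names and paths may differ):

```toml
# Hypothetical: the downstream crate re-exports the flag so that
# `--no-default-features` on this crate also disables it in the core
# goose crate it depends on.
[features]
default = ["local-inference"]
local-inference = ["goose/local-inference"]
```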
For sure! I will take care of this, thanks!
💡 Codex Review
Reviewed commit: d9e987dd7f
Force-pushed from d9e987d to d8abd8c
💡 Codex Review
Reviewed commit: d8abd8cdab
Force-pushed from d8abd8c to e1f7f96
Force-pushed from 4cb7a8b to e4fea42
Force-pushed from ef27049 to d8e78b4
💡 Codex Review
Reviewed commit: d8e78b414c
```rust
#[cfg(feature = "code-mode")]
pub async fn run_prompt_codemode<C: Connection>() {
```
Gate all code-mode ACP tests consistently
Adding #[cfg(feature = "code-mode")] here removes run_prompt_codemode from no-code-mode builds, but crates/goose-acp/tests/provider_test.rs still imports and invokes it unconditionally (use ... run_prompt_codemode and test_prompt_codemode). That makes cargo test -p goose-acp --no-default-features fail to compile with an unresolved symbol, so the no-default-features path is currently broken.
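A minimal sketch of the consistent gating the reviewer is asking for. Feature and helper names come from the review comment; the bodies are hypothetical, and the real ACP helper is async, which is simplified away here:

```rust
// Sketch: the helper and every test that calls it carry the same cfg,
// so a `--no-default-features` build never sees a dangling reference.
#[cfg(feature = "code-mode")]
pub fn run_prompt_codemode() -> &'static str {
    "ran code-mode prompt"
}

// The test module is compiled out together with the helper it calls.
#[cfg(all(test, feature = "code-mode"))]
mod code_mode_tests {
    #[test]
    fn test_prompt_codemode() {
        assert_eq!(super::run_prompt_codemode(), "ran code-mode prompt");
    }
}

// Code that must exist in every build can branch on cfg! instead of
// being removed outright.
pub fn code_mode_enabled() -> bool {
    cfg!(feature = "code-mode")
}

fn main() {
    println!("code-mode enabled: {}", code_mode_enabled());
}
```

The key point is that the `#[cfg]` on the test site must match the `#[cfg]` on the definition, otherwise `cargo test -p goose-acp --no-default-features` fails with an unresolved symbol, as described above.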
```typescript
const activeProviders = providersResponse.filter(
  (provider) => provider.is_configured && (provider.name !== 'local' || localInferenceAvailable)
);
```
Reset stale local provider when filtering unavailable providers
This filter can remove local from activeProviders while the modal state still holds provider === 'local' (for example, a user with persisted local config opening a build without local inference). In that state, submit still uses the stale provider and getProviderMetadata('local', getProviders) throws (modelInterface.ts), so model switching fails instead of gracefully falling back to a valid provider.
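The fallback the reviewer suggests can be sketched as a small helper. The function and state names here are hypothetical, not from the PR; it combines the existing filter with a reset of a stale persisted selection:

```typescript
interface Provider {
  name: string;
  is_configured: boolean;
}

// Hypothetical helper: filter providers the way the component does,
// then fall back when the persisted selection is no longer available,
// instead of letting a later getProviderMetadata() lookup throw.
function resolveProvider(
  providersResponse: Provider[],
  localInferenceAvailable: boolean,
  persisted: string | null
): string | null {
  const activeProviders = providersResponse.filter(
    (provider) =>
      provider.is_configured &&
      (provider.name !== 'local' || localInferenceAvailable)
  );
  if (persisted && activeProviders.some((p) => p.name === persisted)) {
    return persisted; // selection is still valid
  }
  // Stale (e.g. persisted 'local' in a no-local-inference build):
  // reset to the first available provider, or null if none remain.
  return activeProviders[0]?.name ?? null;
}
```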
Force-pushed from 54e85d2 to 2687540
💡 Codex Review
Reviewed commit: 26875402e1
```typescript
const activeProviders = providersResponse.filter(
  (provider) => provider.is_configured && (provider.name !== 'local' || localInferenceAvailable)
);
```
Reset stale local selection when local provider is filtered
Removing local from activeProviders when localInferenceAvailable is false does not clear the modal's existing provider/model state, so users with persisted local settings can still submit provider === "local" even though it is no longer selectable. Fresh evidence: submit later calls getProviderMetadata(provider || '', getProviders) in this same component; in no-local-inference builds the providers API omits local, causing that lookup to throw and the model-switch flow to fail.
Gate the local LLM inference provider behind a `local-inference` feature flag (default ON) so downstream builds can opt out of the heavy llama-cpp-2 native dependency and its libclang build requirement. When disabled, the LocalInferenceProvider is not registered, the server local_inference routes are excluded, and the CLI `local-models` command is removed.

Made-with: Cursor
Relates to aaif-goose#5716

Signed-off-by: Rodolfo Olivieri <rodolfo.olivieri3@gmail.com>
Co-authored-by: Claude (Anthropic) <noreply@anthropic.com>
Signed-off-by: Rodolfo Olivieri <rodolfo.olivieri3@gmail.com>
Force-pushed from 2687540 to 8632e4d
Closing this since #7976 got merged first.
Summary
Gate the local LLM inference provider behind a `local-inference` feature flag (default ON) so downstream builds can opt out of the heavy llama-cpp-2 native dependency and its libclang build requirement.
When disabled, the LocalInferenceProvider is not registered, the server local_inference routes are excluded, and the CLI `local-models` command is removed.
Made-with: Cursor
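The server-side exclusion described above can be sketched as follows. Route paths and the function are hypothetical; real code would use `#[cfg(feature = "local-inference")]` on the handler and registration so the gated code is not compiled at all, while this runnable sketch uses `cfg!` for illustration:

```rust
// Hypothetical sketch: when built without the feature, the local
// inference endpoint is simply never registered, so the frontend
// receives a 404 for it and should hide the provider accordingly.
fn registered_routes() -> Vec<&'static str> {
    let mut routes = vec!["/status", "/providers"];
    if cfg!(feature = "local-inference") {
        routes.push("/local_inference/models");
    }
    routes
}

fn main() {
    println!("{} routes registered", registered_routes().len());
}
```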
Type of Change
AI Assistance
Testing
Tested locally with `cargo test --no-default-features` and with `cargo test`.
Relates to #5716
Discussion: LINK (if any)
Screenshots/Demos (for UX changes)
Before:
After: