
Add GitHub Copilot CLI as a provider #8108

Closed
vincenzopalazzo wants to merge 5 commits into aaif-goose:main from vincenzopalazzo:claude/gifted-villani


Conversation

@vincenzopalazzo
Contributor

Summary

  • Add new copilot-cli provider that wraps the GitHub Copilot CLI tool, following the same pattern as gemini-cli and cursor-agent providers
  • Add COPILOT_CLI_COMMAND config value (defaults to copilot)
  • Register provider in factory and module system
  • Update CLI providers documentation with setup instructions, configuration table, and feature descriptions

Closes #8104
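As a rough sketch of the wrapper approach described in the summary (helper names are hypothetical; goose's actual provider trait is not shown), the provider resolves the binary from COPILOT_CLI_COMMAND, falling back to copilot, and passes the prompt non-interactively:

```rust
use std::env;

/// Resolve which binary to spawn: honor the COPILOT_CLI_COMMAND override,
/// defaulting to "copilot" as the summary describes. Illustrative only.
fn resolve_command() -> String {
    env::var("COPILOT_CLI_COMMAND").unwrap_or_else(|_| "copilot".to_string())
}

/// Assemble the prompt arguments for one non-interactive invocation.
/// Kept separate from spawning so the flag logic is easy to unit-test.
fn prompt_args(prompt: &str) -> Vec<String> {
    vec!["-p".to_string(), prompt.to_string()]
}
```

The review comments that follow refine the exact flags passed alongside the prompt.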

Test plan

  • Verify cargo check and cargo clippy pass (confirmed locally)
  • Install GitHub Copilot CLI and configure with GOOSE_PROVIDER=copilot-cli
  • Start a goose session and verify prompts are sent to copilot CLI and responses returned
  • Verify COPILOT_CLI_COMMAND env var override works for custom paths
  • Test model selection with -m flag for known models

🤖 Generated with Claude Code


@chatgpt-codex-connector (Bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 2a55e49720

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment on lines +105 to +107
```rust
if COPILOT_CLI_KNOWN_MODELS.contains(&self.model.model_name.as_str()) {
    cmd.arg("-m").arg(&self.model.model_name);
}
```

P1: Use --model when setting Copilot CLI model

Copilot CLI's documented programmatic flag is --model=MODEL (as shown in copilot help and GitHub's CLI command reference), but this code passes -m. Because the provider takes this branch for known models (including the default gpt-4.1), requests can fail with an unknown-option error, making the new provider unusable whenever a model is selected.


Contributor Author


Fixed in a750652. Changed -m to --model per Copilot CLI's documented flag.
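A minimal sketch of the corrected branch, assuming the same known-models gate (the helper name and signature are illustrative, not goose's actual code):

```rust
/// After the fix: pass the documented long flag --model rather than -m.
/// Returns the model-selection arguments, or nothing for unknown models.
fn model_args(model_name: &str, known: &[&str]) -> Vec<String> {
    if known.contains(&model_name) {
        vec!["--model".to_string(), model_name.to_string()]
    } else {
        Vec::new()
    }
}
```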

Comment on lines +112 to +115
```rust
    cmd.arg("-p").arg(&sanitized_prompt).arg("--yolo");
} else {
    cmd.arg("-p").arg(&full_prompt).arg("--yolo");
}
```

P2: Suppress non-response output in Copilot CLI mode

The response parser concatenates all stdout lines into assistant text, but Copilot CLI only guarantees response-only output when -s/--silent is used; otherwise it includes stats/decoration in programmatic mode. Without --silent, usage/metadata lines are persisted as model output, which pollutes transcripts and can degrade follow-up prompt quality.


Contributor Author


Fixed in a750652. Added --silent flag to the command to suppress stats/decoration lines from stdout.

Adds a new `copilot-cli` provider that wraps the GitHub Copilot CLI tool,
allowing users with a Copilot subscription to use it through goose's
session management and workflow integration.

Closes aaif-goose#8104

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Vincenzo Palazzo <vincenzopalazzodev@gmail.com>

@chatgpt-codex-connector (Bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: a84bcdee2a


Comment on lines +235 to +236
```rust
let trimmed = line.trim();
if !trimmed.is_empty() {
```

P1: Preserve response whitespace when streaming Copilot output

Use trim() here only for emptiness checks, not for emitted content: this strips leading indentation and all trailing/newline whitespace from every line before yielding it. Because downstream coalescing in collect_stream concatenates text chunks directly, multiline/code responses from Copilot CLI are flattened (for example, indented code blocks lose indentation and line boundaries), which can materially corrupt generated output.


Contributor Author


Fixed in a750652. Now only stripping trailing newline/CR, preserving leading indentation in streamed content.

- Use --model instead of -m for Copilot CLI model selection
- Add --silent flag to suppress stats/decoration output
- Preserve leading whitespace in streamed output lines

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Vincenzo Palazzo <vincenzopalazzodev@gmail.com>
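The whitespace fix at this stage can be sketched as a small pure helper (the name is illustrative; the real change lives in copilot_cli.rs):

```rust
/// Strip only the trailing newline/CR from a streamed line, keeping any
/// leading indentation intact, per the review fix described above.
fn clean_line(line: &str) -> &str {
    line.trim_end_matches('\n').trim_end_matches('\r')
}
```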

@chatgpt-codex-connector (Bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: a750652cf6


Comment thread on crates/goose/src/providers/copilot_cli.rs (outdated)
Remove the COPILOT_CLI_KNOWN_MODELS gate so any valid model name is
forwarded to the CLI. Add --no-ask-user to prevent hangs when Copilot
triggers clarification prompts in non-interactive mode.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

@chatgpt-codex-connector (Bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 94d5cd3bc3


Comment on lines +233 to +235
```rust
if !line.trim().is_empty() {
    let content = line.trim_end_matches('\n').trim_end_matches('\r');
    lines.push(content.to_string());
```

P1: Preserve line breaks when emitting Copilot stream chunks

This stream path still removes \n/\r from every line and drops whitespace-only lines; collect_stream then coalesces chunks via push_str without reinserting delimiters, so multiline/code responses are flattened and blank lines are lost. Fresh evidence after the earlier trim fix: content is now built with trim_end_matches(...), but no newline is ever preserved or re-added before yielding, so formatting corruption still occurs whenever Copilot returns more than one line.


Contributor Author


Fixed in bbb491a. Now preserving trailing newlines in yielded text chunks and no longer dropping blank lines, so collect_stream concatenation keeps multiline formatting intact.

Keep trailing newlines when yielding partial text chunks so that
collect_stream concatenation preserves multiline formatting. Also
yield blank lines to maintain paragraph breaks in code responses.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
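The final behavior described in the commit above can be sketched as follows (helper names are illustrative): each raw line is normalized and re-terminated with a newline, so plain concatenation downstream reconstructs multiline output, blank lines included:

```rust
/// Normalize one raw stdout line into a chunk that keeps its line break:
/// strip any \r\n or \n terminator, then re-append a single '\n'.
fn to_chunk(raw_line: &str) -> String {
    let body = raw_line.trim_end_matches('\n').trim_end_matches('\r');
    format!("{body}\n")
}

/// Stand-in for collect_stream's concatenation: join chunks with no
/// extra delimiter, relying on the preserved trailing newlines.
fn join_chunks(chunks: &[&str]) -> String {
    chunks.iter().copied().map(to_chunk).collect()
}
```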
@DOsinga
Collaborator

DOsinga commented Mar 25, 2026

Hi @vincenzopalazzo — thanks for the PR, and genuinely, a lot of your contributions have been right on point for targeted fixes. We really appreciate that energy.

For this one though, we need to close it — it doesn't implement what the underlying issue (#8104) asked for. Issue #8104 requests a Copilot ACP provider, which means wiring up the Copilot CLI's --acp flag into the existing ACP provider framework. The right implementation looks almost identical to gemini_acp.rs — it's about 100 lines and delegates everything to AcpProvider.

What was submitted instead is a CLI text-prompt wrapper that shells out with -p, which is a fundamentally different approach — and one that's also largely redundant, since we already have a full github_copilot provider that talks to the Copilot API directly.

We'd love to see a follow-up PR that actually implements the ACP variant, but please read the issue carefully first rather than pointing an AI coding tool at the title and letting it run. The pattern to follow is right there in the codebase.
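The shape DOsinga describes — a thin shim that hands the CLI's --acp mode to a shared ACP layer — might look roughly like this. Every type and method name here is hypothetical; the real structure is whatever gemini_acp.rs and AcpProvider define in the goose codebase:

```rust
/// Purely illustrative: a provider that only knows how to describe the
/// ACP-mode invocation, delegating all protocol handling elsewhere.
struct CopilotAcp {
    command: String, // e.g. "copilot", possibly overridden by config
}

impl CopilotAcp {
    fn new(command: impl Into<String>) -> Self {
        Self { command: command.into() }
    }

    /// The command line a shared ACP layer would spawn: the CLI binary
    /// started with its --acp flag (the flag named in the issue).
    fn acp_invocation(&self) -> (String, Vec<String>) {
        (self.command.clone(), vec!["--acp".to_string()])
    }
}
```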

@DOsinga DOsinga closed this Mar 25, 2026
@vincenzopalazzo
Contributor Author

Hi @DOsinga, thanks for pointing me in the right direction. I will work on a follow-up PR, thanks!

@vincenzopalazzo vincenzopalazzo deleted the claude/gifted-villani branch March 25, 2026 14:03


Development

Successfully merging this pull request may close these issues.

Support GitHub Copilot CLI as an ACP Provider

2 participants