@alexhancock (Collaborator)
Implements basic support for MCP Sampling

  • MCP servers can now request sampling and it will be handled by goose to route a sampling message to the provider for completion
  • Works for desktop and CLI
  • Does not yet implement human-in-the-loop approval of these, which is a SHOULD for clients in the MCP spec; that added significant overhead to the initial implementation, so the idea is to ship a scoped, working implementation first and then iterate
  • Sampling is supposed to support text, image, or audio content. This does not handle audio because, interestingly, goose does not handle audio message content at all. That can be a follow-up once we support it generally.
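To make the routing concrete, here is a minimal sketch of the flow described above: the MCP server's sampling request arrives at the client, which forwards it to the configured provider for completion. All type and function names below are hypothetical stand-ins, not goose's actual API; note there is deliberately no audio variant, mirroring the PR's scope.

```rust
// Hypothetical sketch of client-side MCP sampling: the server's
// request is routed to the provider, and the completion is returned.

#[allow(dead_code)]
#[derive(Debug, Clone, PartialEq)]
enum SamplingContent {
    Text(String),
    Image { data: String, mime_type: String },
    // No Audio variant: audio content is out of scope for now.
}

#[allow(dead_code)]
struct SamplingRequest {
    messages: Vec<SamplingContent>,
    max_tokens: u32,
}

#[derive(Debug)]
struct SamplingResponse {
    content: SamplingContent,
    model: String,
}

// Stand-in for calling the agent's provider for a completion.
fn complete_with_provider(req: &SamplingRequest) -> Result<SamplingResponse, String> {
    if req.messages.is_empty() {
        return Err("sampling request had no messages".to_string());
    }
    Ok(SamplingResponse {
        content: SamplingContent::Text("stub completion".to_string()),
        model: "stub-model".to_string(),
    })
}

// The client-side handler: receives the server's sampling request and
// forwards it to the provider, returning the completion to the server.
fn handle_sampling_request(req: SamplingRequest) -> Result<SamplingResponse, String> {
    complete_with_provider(&req)
}

fn main() {
    let req = SamplingRequest {
        messages: vec![SamplingContent::Text(
            "Give me a quote from The Great Gatsby".to_string(),
        )],
        max_tokens: 256,
    };
    let resp = handle_sampling_request(req).expect("sampling failed");
    println!("model={} content={:?}", resp.model, resp.content);
}
```

A real handler would also be where human-in-the-loop approval slots in, between receiving the request and calling the provider.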

Demos of it working in CLI and Desktop with the everything server, using the prompt:

Use the sampleLLM tool in the everything extension to have the mcp server request sampling from the model. Have the mcp server ask the model for a quote from the great gatsby

[Two screenshots, 2025-10-24: the demo working in CLI and Desktop]

@jamadeo (Collaborator) left a comment:

nice!

ClientHandler, Peer, RoleClient, ServiceError, ServiceExt,
ClientHandler, ErrorData as McpError, ErrorData, Peer, RoleClient, ServiceError, ServiceExt,
};
use schemars::_private::NoSerialize;
Collaborator:

suspicious-looking import!

Collaborator (Author):

It was needed for e.maybe_to_value(), but we're no longer using that, so it's gone!


let result = extension_manager
.add_extension(config)
.add_extension(config, None)
Collaborator:

probably fine for now, but I suppose we will want to allow this to work too

Collaborator (Author):

In theory I fixed this in the most recent commit, which changes how the extension manager holds a reference to a provider. I didn't test explicitly for sampling through extension_manager-managed extensions, but I think it may work now. I'll add a note to follow up on it.

@DOsinga (Collaborator) left a comment:

This generally looks good and would allow us to ship something. I'm a little concerned about how we handle providers. Within the context of MCP, I don't think they should be optional: we can't have an MCP call without a provider, right?

The other concern is that we now set a provider on the extension_manager. In some scenarios we might not have it yet; maybe that's why it is optional? But I don't think we can be sure we're using the right provider if the user switches models, providers, etc.

We should use the provider of the agent, I think.
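The fix the review converges on, holding a shared reference to the agent's provider rather than a copy, can be sketched as below. The `Agent`, `ExtensionManager`, and `Provider` types here are illustrative, not goose's actual types; the point is that a provider switch on the agent is immediately visible wherever sampling is handled.

```rust
use std::sync::{Arc, RwLock};

// Illustrative sketch: the agent and the extension manager share one
// provider reference, so sampling always sees the current provider.

#[derive(Debug, Clone)]
struct Provider {
    name: String,
}

struct Agent {
    provider: Arc<RwLock<Provider>>,
}

struct ExtensionManager {
    // Shared reference, not a snapshot: reads reflect later switches.
    provider: Arc<RwLock<Provider>>,
}

impl ExtensionManager {
    fn current_provider_name(&self) -> String {
        self.provider.read().unwrap().name.clone()
    }
}

fn main() {
    let shared = Arc::new(RwLock::new(Provider { name: "openai".to_string() }));
    let agent = Agent { provider: Arc::clone(&shared) };
    let manager = ExtensionManager { provider: Arc::clone(&shared) };

    assert_eq!(manager.current_provider_name(), "openai");

    // The user switches providers on the agent; the manager observes it.
    agent.provider.write().unwrap().name = "anthropic".to_string();
    assert_eq!(manager.current_provider_name(), "anthropic");
    println!("sampling now routed to {}", manager.current_provider_name());
}
```

Had the manager cloned the `Provider` instead, it would keep sampling against a stale provider after a model or provider switch, which is exactly the concern raised above.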

ErrorData::new(
ErrorCode::INTERNAL_ERROR,
"Unexpected error while completing the prompt",
e.maybe_to_value(),
Collaborator:

why maybe_to_value()? I think it would return None here; we should probably return the relevant string instead

Collaborator (Author):

Adjusted

@alexhancock force-pushed the alexhancock/mcp-sampling branch 8 times, most recently from 30943fd to c84f539 on October 28, 2025 at 21:11
@alexhancock (Collaborator, Author) commented Oct 28, 2025:

@jamadeo @DOsinga Updated it to always use the agent's current provider via a shared reference. Verified via the llm_request logs that it uses different models for sampling after switching providers.

@alexhancock force-pushed the alexhancock/mcp-sampling branch from c84f539 to 93afbd6 on October 29, 2025 at 13:14
@DOsinga (Collaborator) left a comment:

how do we test this?

})?;

// convert back to MCP messages
let response_content = if let Some(content) = response.content.first() {
Collaborator:

only first?
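The "only first?" concern, that `response.content.first()` silently drops every item after the first, could be addressed by mapping the whole list back into MCP content. A hedged sketch, using illustrative stand-in enums rather than the real rmcp/goose types:

```rust
// Hypothetical sketch: convert every provider response item back into
// MCP content instead of taking only the first. The two enums below
// are stand-ins for the real provider/MCP content types.

#[derive(Debug, Clone)]
enum ProviderContent {
    Text(String),
    Image { data: String, mime: String },
}

#[derive(Debug, Clone, PartialEq)]
enum McpContent {
    Text(String),
    Image { data: String, mime_type: String },
}

fn to_mcp_contents(items: &[ProviderContent]) -> Vec<McpContent> {
    // Map each item; nothing after the first is silently dropped.
    items
        .iter()
        .map(|c| match c {
            ProviderContent::Text(t) => McpContent::Text(t.clone()),
            ProviderContent::Image { data, mime } => McpContent::Image {
                data: data.clone(),
                mime_type: mime.clone(),
            },
        })
        .collect()
}

fn main() {
    let response = vec![
        ProviderContent::Text("first part".to_string()),
        ProviderContent::Text("second part".to_string()),
    ];
    let converted = to_mcp_contents(&response);
    assert_eq!(converted.len(), 2);
    println!("{:?}", converted);
}
```

Callers that genuinely need a single item can still take the first element of the returned vector, but the lossy truncation then happens explicitly at the call site rather than inside the conversion.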

@alexhancock force-pushed the alexhancock/mcp-sampling branch 5 times, most recently from fa587e7 to 1680290 on October 29, 2025 at 18:49
@alexhancock force-pushed the alexhancock/mcp-sampling branch from 1680290 to 3e2bc35 on October 29, 2025 at 20:43
@alexhancock merged commit 06b7a63 into main on Oct 29, 2025
14 checks passed
@alexhancock deleted the alexhancock/mcp-sampling branch on October 29, 2025 at 21:06
@alexhancock (Collaborator, Author):
Draft of a couple different approaches to integration testing for sampling #5456

Wanted to make a separate PR to get feedback on which one seems most valuable

@jamadeo @DOsinga

fbalicchia pushed a commit to fbalicchia/goose that referenced this pull request Nov 7, 2025
BlairAllan pushed a commit to BlairAllan/goose that referenced this pull request Nov 29, 2025