54 changes: 48 additions & 6 deletions docs/clients/sampling.mdx
@@ -177,11 +177,15 @@ client = Client(
)
```

## Using the OpenAI Handler
## Built-in Handlers

<VersionBadge version="2.14.0" />
FastMCP provides built-in sampling handlers for the OpenAI and Anthropic APIs. These handlers support the full sampling API, including tool use, and handle message conversion and response formatting automatically.

For full-featured sampling with tool support, use the built-in OpenAI handler. It handles message conversion, tool calls, and response formatting automatically:
### OpenAI Handler

<VersionBadge version="2.11.0" />

The OpenAI handler works with OpenAI's API and any OpenAI-compatible provider:

```python
from fastmcp import Client
@@ -193,7 +197,7 @@ client = Client(
)
```

The handler works with any OpenAI-compatible API by passing a custom client:
For OpenAI-compatible APIs (like local models), pass a custom client:

```python
from openai import AsyncOpenAI
@@ -208,9 +212,47 @@ client = Client(
```

<Note>
Tool execution happens on the server side. The client's role is to pass tools to the LLM and return the LLM's response (which may include tool use requests). The server then executes the tools and may send follow-up sampling requests with tool results.
Install the OpenAI handler with `pip install fastmcp[openai]`.
</Note>

### Anthropic Handler

<VersionBadge version="2.14.1" />

The Anthropic handler uses Claude models via the Anthropic API:

```python
from fastmcp import Client
from fastmcp.client.sampling.handlers.anthropic import AnthropicSamplingHandler

client = Client(
"my_mcp_server.py",
sampling_handler=AnthropicSamplingHandler(default_model="claude-sonnet-4-5"),
)
```

You can pass a custom client for advanced configuration:

```python
from anthropic import AsyncAnthropic

client = Client(
"my_mcp_server.py",
sampling_handler=AnthropicSamplingHandler(
default_model="claude-sonnet-4-5",
client=AsyncAnthropic(), # Uses ANTHROPIC_API_KEY env var
),
)
```

<Note>
Install the Anthropic handler with `pip install fastmcp[anthropic]`.
</Note>
Comment on lines +218 to 250

⚠️ Potential issue | 🟠 Major

Use environment variables for API keys; avoid hard-coded placeholders in docs.

The example Anthropic(api_key="your-key") encourages committing secrets to code. Instead, use Anthropic() (relies on ANTHROPIC_API_KEY env var by default) or explicitly pass os.environ["ANTHROPIC_API_KEY"]. The handler accepts only the synchronous Anthropic client, not AsyncAnthropic.

Suggested edit:

-from anthropic import Anthropic
+import os
+from anthropic import Anthropic
 
 client = Client(
     "my_mcp_server.py",
     sampling_handler=AnthropicSamplingHandler(
         default_model="claude-sonnet-4-5",
-        client=Anthropic(api_key="your-key"),
+        client=Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"]),
     ),
 )

Or simpler, omit api_key entirely since the SDK reads ANTHROPIC_API_KEY by default.

🤖 Prompt for AI Agents
In docs/clients/sampling.mdx around lines 218 to 248, the example shows a
hard-coded API key (Anthropic(api_key="your-key")) which encourages committing
secrets and is inaccurate about client types; update the examples to use the
default environment-driven constructor (Anthropic()) or pass
os.environ["ANTHROPIC_API_KEY"] so keys come from ANTHROPIC_API_KEY, and note
that the sampling handler requires the synchronous Anthropic client (not
AsyncAnthropic); remove the literal "your-key" placeholder and adjust the text
to recommend installing via pip and relying on env vars for credentials.


### Tool Execution

Tool execution happens on the server side. The client's role is to pass tools to the LLM and return the LLM's response (which may include tool use requests). The server then executes the tools and may send follow-up sampling requests with tool results.

<Tip>
To implement a custom sampling handler, see the [OpenAISamplingHandler source code](https://github.com/jlowin/fastmcp/blob/main/src/fastmcp/client/sampling/handlers/openai.py) as a reference.
To implement a custom sampling handler, see the [handler source code](https://github.com/jlowin/fastmcp/tree/main/src/fastmcp/client/sampling/handlers) as a reference.
</Tip>
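The tool-execution flow above can be sketched with a toy handler. Everything here is illustrative: the message classes are stdlib stand-ins for the real MCP SDK types, the parameter names are assumptions, and the handler echoes the last message instead of calling an LLM provider.

```python
import asyncio
from dataclasses import dataclass

# Stand-in message types for illustration only; in a real handler these
# come from the MCP SDK (e.g. SamplingMessage with text content).
@dataclass
class TextContent:
    text: str

@dataclass
class SamplingMessage:
    role: str
    content: TextContent

async def echo_sampling_handler(messages, params=None, context=None):
    """Toy handler: a real one would forward `messages` (and any tools)
    to a provider SDK and return the model's reply."""
    return f"echo: {messages[-1].content.text}"

reply = asyncio.run(
    echo_sampling_handler([SamplingMessage("user", TextContent("hi"))])
)
print(reply)  # → echo: hi
```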
32 changes: 16 additions & 16 deletions docs/servers/sampling.mdx
@@ -443,32 +443,32 @@ tool_result = ToolResultContent(

Client support for sampling is optional—some clients may not implement it. To ensure your tools work regardless of client capabilities, configure a `sampling_handler` that sends requests directly to an LLM provider.

### OpenAI Handler

FastMCP provides an OpenAI-compatible handler that works with OpenAI's API and compatible providers. It supports the full sampling API including tools, automatically converting your Python functions to OpenAI's function calling format.
FastMCP provides built-in handlers for [OpenAI and Anthropic APIs](/clients/sampling#built-in-handlers). These handlers support the full sampling API including tools, automatically converting your Python functions to each provider's format.
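As a rough illustration of that function-to-schema conversion (this is not FastMCP's actual converter; the helper name and type mapping are assumptions), a Python function can be mapped to an OpenAI-style function-calling spec like so:

```python
import inspect

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

# Minimal annotation-to-JSON-Schema type mapping (illustrative only).
TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def to_openai_tool(fn):
    """Build an OpenAI-style tool spec from a function's signature."""
    sig = inspect.signature(fn)
    props = {
        name: {"type": TYPE_MAP.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn) or "",
            "parameters": {
                "type": "object",
                "properties": props,
                "required": list(props),
            },
        },
    }

spec = to_openai_tool(add)
print(spec["function"]["name"])  # → add
```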

<Note>
The OpenAI handler requires the `openai` package. Install it with:
```bash
pip install fastmcp[openai]
# or
pip install openai
```
You'll also need to set the `OPENAI_API_KEY` environment variable or pass it directly to the client.
Install handlers with `pip install fastmcp[openai]` or `pip install fastmcp[anthropic]`.
</Note>
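Both handlers read provider credentials from the environment by default, so set the relevant key before starting the server (these are the standard variable names read by the OpenAI and Anthropic SDKs; the key values are placeholders):

```shell
export OPENAI_API_KEY="sk-..."        # read by the OpenAI SDK
export ANTHROPIC_API_KEY="sk-ant-..." # read by the Anthropic SDK
```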

```python
import os
from openai import OpenAI
from fastmcp import FastMCP
from fastmcp.client.sampling.handlers.openai import OpenAISamplingHandler

server = FastMCP(
name="My Server",
sampling_handler=OpenAISamplingHandler(
default_model="gpt-4o-mini",
client=OpenAI(api_key=os.getenv("OPENAI_API_KEY")),
),
sampling_handler=OpenAISamplingHandler(default_model="gpt-4o-mini"),
sampling_handler_behavior="fallback",
)
```

Or with Anthropic:

```python
from fastmcp import FastMCP
from fastmcp.client.sampling.handlers.anthropic import AnthropicSamplingHandler

server = FastMCP(
name="My Server",
sampling_handler=AnthropicSamplingHandler(default_model="claude-sonnet-4-5"),
sampling_handler_behavior="fallback",
)
```
Comment on lines +446 to 474

⚠️ Potential issue | 🟡 Minor

Add the missing prerequisite for Anthropic: how you provide credentials.
The examples are copy/pasteable, but you should tell the reader to set ANTHROPIC_API_KEY (or show it in code) so the snippet runs.

Suggested doc tweak (minimal):

  • Add a sentence (or a <Note>) near the install note saying you set OPENAI_API_KEY / ANTHROPIC_API_KEY in the environment before running the server.
🤖 Prompt for AI Agents
In docs/servers/sampling.mdx around lines 446 to 474, the Anthropic and OpenAI
example snippets don't mention how to supply credentials; add a short Note after
the install line stating that users must set OPENAI_API_KEY or ANTHROPIC_API_KEY
in their environment (or show how to provide equivalent credentials) before
running the server so the handler can authenticate; keep the wording minimal and
mirror the existing Note style.

3 changes: 2 additions & 1 deletion pyproject.toml
@@ -47,12 +47,13 @@ classifiers = [
]

[project.optional-dependencies]
anthropic = ["anthropic>=0.40.0"]
openai = ["openai>=1.102.0"]

[dependency-groups]
dev = [
"dirty-equals>=0.9.0",
"fastmcp[openai]",
"fastmcp[anthropic,openai]",
# add optional dependencies for fastmcp dev
"fastapi>=0.115.12",
"inline-snapshot[dirty-equals]>=0.27.2",