Description
I’m running Firecrawl locally (Docker); the local API responds successfully to POST /v1/scrape (HTTP 200).
The firecrawl-mcp server is loaded in LM Studio 0.3.23, is initialized with the API URL http://localhost:3002/v1, and registers as a tool.
However, when a tool call is triggered from within LM Studio, the model still replies “API key required” / “I don’t have access to the Firecrawl API key”, as if it were using the cloud service.
This happens regardless of which model I use and whether or not I wrap the startup in a hardfix script.
Environment
- OS: Windows (tested with PowerShell and cmd)
- LM Studio: 0.3.23
- Firecrawl (self-hosted): via Docker Compose
  - API running on port 3002
  - Logs show: “Authentication is disabled …”
- firecrawl-mcp-server: via `npx -y firecrawl-mcp` (also tested `@latest`)
- Node/npm: standard installation in `C:\Program Files\nodejs\` (npx available)
- Models tested in LM Studio:
  - openai/gpt-oss-20b
  - qwen/qwen3-coder-30b
  - also tried Qwen/Llama instruct models

Note: No FIRECRAWL_API_KEY is set anywhere (neither in User/System environment variables nor in mcp.json).
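For reference, the MCP entry in my LM Studio mcp.json looks roughly like the sketch below. The server name `firecrawl` is my label, and I’m assuming FIRECRAWL_API_URL in the `env` block is the right way to point firecrawl-mcp at a self-hosted instance (I’m not sure whether the value should include the `/v1` suffix):

```json
{
  "mcpServers": {
    "firecrawl": {
      "command": "npx",
      "args": ["-y", "firecrawl-mcp"],
      "env": {
        "FIRECRAWL_API_URL": "http://localhost:3002"
      }
    }
  }
}
```

No FIRECRAWL_API_KEY appears anywhere in this file.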
Expected behavior
LM Studio should send tool calls through the MCP server to my local Firecrawl API without requiring a cloud API key.
Models should output the tool response (e.g. the Markdown for https://example.com).
Actual behavior
Despite correct MCP initialization and tool registration, the model replies:
“I’m sorry, but I don’t have access to the Firecrawl API key …”
or
“API key required.”
This looks like a fallback to the cloud service, or a generic error message, even though FIRECRAWL_API_URL is set and the local API returns 200.
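My hypothesis is that the API URL never reaches the spawned server process, so it falls back to the cloud endpoint and demands a key. A minimal Python sketch of that presumed resolution logic (the function, the cloud default URL, and the returned fields are illustrative assumptions, not firecrawl-mcp’s actual code):

```python
import os

# Illustrative cloud default, NOT taken from firecrawl-mcp's source.
CLOUD_URL = "https://api.firecrawl.dev"

def resolve_backend(env=os.environ):
    """Hypothetical sketch: prefer a self-hosted URL from the
    environment; otherwise fall back to the cloud API, which
    requires an API key."""
    api_url = env.get("FIRECRAWL_API_URL")
    if api_url:
        # Self-hosted: no key needed when server-side auth is disabled.
        return {"base_url": api_url.rstrip("/"), "requires_key": False}
    # Cloud fallback: a missing FIRECRAWL_API_KEY would produce the
    # "API key required" behavior seen in LM Studio.
    return {
        "base_url": CLOUD_URL,
        "requires_key": env.get("FIRECRAWL_API_KEY") is None,
    }

# If LM Studio spawns npx without forwarding env vars, the child
# process sees no FIRECRAWL_API_URL and takes the cloud branch:
print(resolve_backend({}))
print(resolve_backend({"FIRECRAWL_API_URL": "http://localhost:3002/v1"}))
```

Under this guess, the fix would be making sure the env block is actually forwarded to the npx child process on Windows.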