# MCP Servers

This repository contains Model Context Protocol (MCP) servers that provide various tools and capabilities for AI models. It was created mainly to provide small example servers for LocalAI, but the servers work with any MCP client.
## DuckDuckGo

A web search server that provides search capabilities using DuckDuckGo.
Features:
- Web search functionality
- Configurable maximum results (default: 5)
- JSON schema validation for inputs/outputs
Tool:

- `search` - Search the web for information
Configuration:

- `MAX_RESULTS` - Environment variable to set the maximum number of search results (default: 5)
Docker Image:

```bash
docker run -e MAX_RESULTS=10 ghcr.io/mudler/mcps/duckduckgo:latest
```
LocalAI configuration (to add to the model config):

```yaml
mcp:
  stdio: |
    {
      "mcpServers": {
        "ddg": {
          "command": "docker",
          "env": {
            "MAX_RESULTS": "10"
          },
          "args": [
            "run", "-i", "--rm", "-e", "MAX_RESULTS",
            "ghcr.io/mudler/mcps/duckduckgo:master"
          ]
        }
      }
    }
```
## Weather

A weather information server that provides current weather and forecast data for cities worldwide.
Features:
- Current weather conditions (temperature, wind, description)
- Multi-day weather forecast
- URL encoding for city names with special characters
- JSON schema validation for inputs/outputs
- HTTP timeout handling
Tool:

- `get_weather` - Get current weather and forecast for a city
API Response Format:

```json
{
  "temperature": "29 °C",
  "wind": "20 km/h",
  "description": "Partly cloudy",
  "forecast": [
    {
      "day": "1",
      "temperature": "27 °C",
      "wind": "12 km/h"
    },
    {
      "day": "2",
      "temperature": "22 °C",
      "wind": "8 km/h"
    }
  ]
}
```
Docker Image:

```bash
docker run ghcr.io/mudler/mcps/weather:latest
```
LocalAI configuration (to add to the model config):

```yaml
mcp:
  stdio: |
    {
      "mcpServers": {
        "weather": {
          "command": "docker",
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/weather:master"
          ]
        }
      }
    }
```
## Memory

A persistent memory storage server that allows AI models to store, retrieve, and manage information across sessions.
Features:
- Persistent JSON file storage
- Add, list, and remove memory entries
- Unique ID generation for each entry
- Timestamp tracking for entries
- Configurable storage location
- JSON schema validation for inputs/outputs
Tools:

- `add_memory` - Add a new entry to memory storage
- `list_memory` - List all memory entries
- `remove_memory` - Remove a memory entry by ID
- `search_memory` - Search memory entries by content (case-insensitive)
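The case-insensitive matching that `search_memory` describes can be sketched with the Go standard library (an illustration of the documented behavior, not the server's actual implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// matches reports whether a memory entry's content contains the query,
// ignoring case - the behavior search_memory documents.
func matches(content, query string) bool {
	return strings.Contains(strings.ToLower(content), strings.ToLower(query))
}

func main() {
	entries := []string{
		"User prefers coffee over tea",
		"Meeting scheduled for Friday",
	}
	for _, e := range entries {
		if matches(e, "COFFEE") {
			fmt.Println("match:", e) // prints the coffee entry only
		}
	}
}
```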
Configuration:

- `MEMORY_FILE_PATH` - Environment variable to set the memory file path (default: `/data/memory.json`)
Memory Entry Format:

```json
{
  "id": "1703123456789000000",
  "content": "User prefers coffee over tea",
  "created_at": "2023-12-21T10:30:56.789Z"
}
```
Search Response Format:

```json
{
  "query": "coffee",
  "results": [
    {
      "id": "1703123456789000000",
      "content": "User prefers coffee over tea",
      "created_at": "2023-12-21T10:30:56.789Z"
    }
  ],
  "count": 1
}
```
Docker Image:

```bash
docker run -e MEMORY_FILE_PATH=/custom/path/memory.json ghcr.io/mudler/mcps/memory:latest
```
LocalAI configuration (to add to the model config):

```yaml
mcp:
  stdio: |
    {
      "mcpServers": {
        "memory": {
          "command": "docker",
          "env": {
            "MEMORY_FILE_PATH": "/data/memory.json"
          },
          "args": [
            "run", "-i", "--rm", "-v", "/host/data:/data",
            "ghcr.io/mudler/mcps/memory:master"
          ]
        }
      }
    }
```
## Development

Prerequisites:

- Go 1.24.7 or later
- Docker (for containerized builds)
- Make (for using the Makefile)
Use the provided Makefile for easy development:

```bash
# Show all available commands
make help

# Development workflow
make dev

# Build a specific server
make MCP_SERVER=duckduckgo build
make MCP_SERVER=weather build
make MCP_SERVER=memory build

# Run tests and checks
make ci-local

# Build multi-architecture images
make build-multiarch
```
To add a new MCP server:

- Create a new directory under the project root
- Implement the server following the MCP SDK patterns
- Update the GitHub Actions workflow matrix in `.github/workflows/image.yml`
- Update this README with the new server information
Example server structure:

```go
package main

import (
    "context"
    "log"

    "github.com/modelcontextprotocol/go-sdk/mcp"
)

func main() {
    server := mcp.NewServer(&mcp.Implementation{
        Name:    "your-server",
        Version: "v1.0.0",
    }, nil)

    // Add your tools here
    mcp.AddTool(server, &mcp.Tool{
        Name:        "your-tool",
        Description: "your tool description",
    }, YourToolFunction)

    if err := server.Run(context.Background(), &mcp.StdioTransport{}); err != nil {
        log.Fatal(err)
    }
}
```
## Docker Images

Docker images are automatically built and pushed to GitHub Container Registry:

- `ghcr.io/mudler/mcps/duckduckgo:latest` - Latest DuckDuckGo server
- `ghcr.io/mudler/mcps/duckduckgo:v1.0.0` - Tagged versions
- `ghcr.io/mudler/mcps/duckduckgo:master` - Development versions
- `ghcr.io/mudler/mcps/weather:latest` - Latest Weather server
- `ghcr.io/mudler/mcps/weather:v1.0.0` - Tagged versions
- `ghcr.io/mudler/mcps/weather:master` - Development versions
- `ghcr.io/mudler/mcps/memory:latest` - Latest Memory server
- `ghcr.io/mudler/mcps/memory:v1.0.0` - Tagged versions
- `ghcr.io/mudler/mcps/memory:master` - Development versions
## Contributing

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Run `make ci-local` to ensure all checks pass
- Submit a pull request
## License

This project is licensed under the terms specified in the LICENSE file.
## About MCP

This project implements servers for the Model Context Protocol (MCP), a standard for connecting AI models to external data sources and tools. For more information about MCP, visit the official documentation.