From 86f716f5a32fdbd96c917b7bb137081562bffe03 Mon Sep 17 00:00:00 2001 From: Jax Liu Date: Fri, 6 Mar 2026 14:40:28 +0800 Subject: [PATCH 1/5] feat(skills): add wren-quickstart and wren-mcp-setup skills MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - Add wren-quickstart skill: end-to-end onboarding flow that orchestrates generate-mdl, wren-project, and wren-mcp-setup from zero to first query - Add wren-mcp-setup skill: Docker container setup + MCP client registration - Rename mdl-project → wren-project for consistency with skill naming convention - Update install.sh, README.md, and SKILLS.md to reflect new skills Co-Authored-By: Claude Sonnet 4.6 --- skills/README.md | 12 +- skills/SKILLS.md | 45 ++- skills/generate-mdl/SKILL.md | 96 +++-- skills/install.sh | 2 +- skills/wren-mcp-setup/SKILL.md | 382 ++++++++++++++++++ skills/{mdl-project => wren-project}/SKILL.md | 2 +- skills/wren-quickstart/SKILL.md | 293 +++++--------- 7 files changed, 602 insertions(+), 230 deletions(-) create mode 100644 skills/wren-mcp-setup/SKILL.md rename skills/{mdl-project => wren-project}/SKILL.md (99%) diff --git a/skills/README.md b/skills/README.md index 747f96e4f..2763a2c68 100644 --- a/skills/README.md +++ b/skills/README.md @@ -32,26 +32,28 @@ npx openskills add Canner/wren-engine ```bash cp -r skills/generate-mdl ~/.claude/skills/ # or all at once: -cp -r skills/generate-mdl skills/mdl-project skills/wren-sql skills/wren-quickstart ~/.claude/skills/ +cp -r skills/generate-mdl skills/wren-project skills/wren-sql skills/wren-mcp-setup skills/wren-quickstart ~/.claude/skills/ ``` Once installed, invoke a skill by name in your conversation: ```text +/wren-quickstart /generate-mdl -/mdl-project +/wren-project /wren-sql -/wren-quickstart +/wren-mcp-setup ``` ## Available Skills | Skill | Description | |-------|-------------| +| [wren-quickstart](wren-quickstart/SKILL.md) | End-to-end quickstart — install skills, generate MDL, save project, 
start MCP server, and verify setup | | [generate-mdl](generate-mdl/SKILL.md) | Generate a Wren MDL manifest from a live database using ibis-server introspection | -| [mdl-project](mdl-project/SKILL.md) | Save, load, and build MDL manifests as version-controlled YAML project directories | +| [wren-project](wren-project/SKILL.md) | Save, load, and build MDL manifests as version-controlled YAML project directories | | [wren-sql](wren-sql/SKILL.md) | Write and correct SQL queries for Wren Engine — types, date/time, BigQuery dialect, error diagnosis | -| [wren-quickstart](wren-quickstart/SKILL.md) | Set up Wren Engine MCP via Docker and connect to Claude Code or other MCP clients | +| [wren-mcp-setup](wren-mcp-setup/SKILL.md) | Set up Wren Engine MCP via Docker, register with Claude Code or other MCP clients, and start querying | See [SKILLS.md](SKILLS.md) for full details on each skill. diff --git a/skills/SKILLS.md b/skills/SKILLS.md index 861b98265..e7bf451ba 100644 --- a/skills/SKILLS.md +++ b/skills/SKILLS.md @@ -4,6 +4,37 @@ Skills are instruction files that extend AI agents with Wren-specific workflows. --- +## wren-quickstart + +**File:** [wren-quickstart/SKILL.md](wren-quickstart/SKILL.md) + +End-to-end onboarding guide for Wren Engine. Orchestrates the full setup flow — from installing skills and creating a workspace, to generating an MDL, saving it as a versioned project, starting the MCP Docker container, and verifying everything works. + +### When to use + +- Setting up Wren Engine for the first time +- Onboarding a new data source from scratch +- Getting a new team member started with Wren MCP + +### Workflow summary + +1. Install required skills via `install.sh` +2. Create a workspace directory on the host machine +3. Generate MDL from the database (`@generate-mdl`) +4. Save as a YAML project and compile to `target/` (`@wren-project`) +5. Start the Docker container and register the MCP server (`@wren-mcp-setup`) +6. 
Run `health_check()` to verify — then start a new session and query + +### Dependent skills + +| Skill | Purpose | +|-------|---------| +| `@generate-mdl` | Introspect database and build MDL JSON | +| `@wren-project` | Save MDL as YAML project + compile to `target/` | +| `@wren-mcp-setup` | Start Docker container and register MCP server | + +--- + ## generate-mdl **File:** [generate-mdl/SKILL.md](generate-mdl/SKILL.md) @@ -33,14 +64,14 @@ Generates a complete Wren MDL manifest by introspecting a live database through 5. Optionally sample data for ambiguous columns 6. Build the MDL JSON (models, columns, relationships) 7. Validate via `mdl_validate_manifest` -8. Optionally save as a YAML project (see `mdl-project`) +8. Optionally save as a YAML project (see `wren-project`) 9. Deploy via `deploy_manifest` --- -## mdl-project +## wren-project -**File:** [mdl-project/SKILL.md](mdl-project/SKILL.md) +**File:** [wren-project/SKILL.md](wren-project/SKILL.md) Manages Wren MDL manifests as human-readable YAML project directories — similar to dbt projects. Makes MDL version-control friendly by splitting the monolithic JSON into one YAML file per model. @@ -112,11 +143,11 @@ Comprehensive SQL authoring and debugging guide for Wren Engine. Covers core que --- -## wren-quickstart +## wren-mcp-setup -**File:** [wren-quickstart/SKILL.md](wren-quickstart/SKILL.md) +**File:** [wren-mcp-setup/SKILL.md](wren-mcp-setup/SKILL.md) -Sets up Wren Engine MCP server via Docker and connects it to Claude Code (or another MCP client) over streamable-http. +Sets up Wren Engine MCP server via Docker, registers it with an AI agent (Claude Code or other MCP clients), and starts a new session to begin interacting with Wren. 
### When to use @@ -150,6 +181,6 @@ Then invoke in your AI client: ``` /generate-mdl -/mdl-project +/wren-project /wren-sql ``` diff --git a/skills/generate-mdl/SKILL.md b/skills/generate-mdl/SKILL.md index 682e6ae42..0092c0a8f 100644 --- a/skills/generate-mdl/SKILL.md +++ b/skills/generate-mdl/SKILL.md @@ -1,10 +1,10 @@ --- name: generate-mdl description: Generate a Wren MDL manifest from a database using ibis-server metadata endpoints. Use when a user wants to create or set up a new Wren MDL, scaffold a manifest from an existing database, or onboard a new data source without installing any database drivers locally. -compatibility: Requires a running ibis-server and the Wren MCP server with tools: setup_connection, list_remote_tables, list_remote_constraints, mdl_validate_manifest, mdl_save_project, deploy_manifest +compatibility: Requires a running ibis-server (default port 8000). No local database drivers needed. metadata: author: wren-engine - version: "1.0" + version: "1.1" --- # Generate Wren MDL @@ -20,77 +20,119 @@ Follow these steps in order. Do not skip steps or ask unnecessary questions betw Ask the user for: 1. **Data source type** — one of: `POSTGRES`, `MYSQL`, `MSSQL`, `DUCKDB`, `BIGQUERY`, `SNOWFLAKE`, `CLICKHOUSE`, `TRINO`, `ATHENA`, `ORACLE`, `DATABRICKS` 2. **Connection credentials** — see [Connection info format](#connection-info-format) below +3. **Schema filter** (optional) — if the database has many schemas, ask which schema(s) to include -Do not ask for a SQLAlchemy connection string. Use the structured `conn_info` dict instead. +Do not ask for a SQLAlchemy connection string. Use the structured `connectionInfo` dict instead. -### Step 2 — Register connection +> **Important:** If the database runs on the host machine and ibis-server runs inside Docker, replace `localhost` / `127.0.0.1` with `host.docker.internal` in the host field. 
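The host rewrite described in the note above can be applied mechanically before building the `connectionInfo` dict. A minimal sketch (the helper name `normalize_host` is illustrative, not part of any Wren or ibis-server API):

```python
def normalize_host(connection_info: dict) -> dict:
    """Rewrite localhost-style hosts so a Dockerized ibis-server can reach
    a database that runs on the host machine."""
    info = dict(connection_info)  # copy; don't mutate the caller's dict
    if info.get("host") in ("localhost", "127.0.0.1"):
        info["host"] = "host.docker.internal"
    return info

conn = {"host": "localhost", "port": "5432", "user": "wren", "database": "shop"}
print(normalize_host(conn)["host"])  # host.docker.internal
```

Remote or cloud hosts (e.g. an RDS hostname) pass through unchanged.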
-``` -setup_connection(datasource=, conn_info=) -``` - -If this fails, report the error and ask the user to correct the credentials. +### Step 2 — Fetch table schema -### Step 3 — Fetch table schema +Call the ibis-server metadata endpoint directly: ``` -list_remote_tables() +POST http://localhost:8000/v3/connector//metadata/tables +Content-Type: application/json + +{ + "connectionInfo": { } +} ``` -ibis-server returns a list of tables with their column names and types. Parse the response to understand the full database schema. +ibis-server returns a list of tables with their column names and types. Each table entry has a `properties.schema` field — use it to filter to the user's target schema if specified. -### Step 4 — Fetch relationships +If this fails, report the error and ask the user to correct the credentials. + +### Step 3 — Fetch relationships ``` -list_remote_constraints() +POST http://localhost:8000/v3/connector//metadata/constraints +Content-Type: application/json + +{ + "connectionInfo": { } +} ``` -ibis-server returns foreign key constraints. Use these to build `Relationship` entries in the MDL. +Returns foreign key constraints. Use these to build `Relationship` entries in the MDL. If the response is empty (`[]`), infer relationships from column naming conventions (e.g. `order_id` → `orders.id`). -### Step 5 — Sample data (optional) +### Step 4 — Sample data (optional) -For columns where purpose is unclear from the name and type alone: +For columns where purpose is unclear from the name and type alone, query a few rows using the raw table name with schema prefix: ``` -query("SELECT * FROM LIMIT 3") +POST http://localhost:8000/v3/connector//query +Content-Type: application/json + +{ + "sql": "SELECT * FROM . LIMIT 3", + "manifestStr": "", + "connectionInfo": { } +} ``` -Note: use the raw table name (not model name) at this stage, since MDL is not yet deployed. +Note: use the raw `schema.table` reference at this stage, since the MDL is not yet deployed. 
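When the constraints endpoint in Step 3 returns `[]`, relationships have to be inferred from column naming conventions. A rough sketch of that heuristic (the function and the `<table>_id` convention are assumptions; always confirm inferred relationships with the user before adding them to the MDL):

```python
def infer_relationships(tables: dict[str, list[str]]) -> list[tuple[str, str, str]]:
    """Guess foreign-key links from columns named like 'order_id'.

    tables maps table name -> list of column names.
    Returns (from_table, fk_column, to_table) triples.
    """
    guesses = []
    for table, columns in tables.items():
        for col in columns:
            if not col.endswith("_id"):
                continue
            stem = col[: -len("_id")]             # "order_id" -> "order"
            for candidate in (stem, stem + "s"):  # also try the plural table name
                if candidate in tables and candidate != table and "id" in tables[candidate]:
                    guesses.append((table, col, candidate))
                    break
    return guesses

tables = {
    "orders": ["id", "customer_id", "status"],
    "customers": ["id", "name"],
    "order_items": ["id", "order_id", "sku"],
}
print(infer_relationships(tables))
```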
-### Step 6 — Build MDL JSON +### Step 5 — Build MDL JSON Construct the manifest following the [MDL structure](#mdl-structure) below. Rules: - `catalog`: use `"wren"` unless the user specifies otherwise -- `schema`: use the database's default schema (e.g. `"public"` for PostgreSQL, `"dbo"` for MSSQL) +- `schema`: use the target schema name (e.g. `"public"` for PostgreSQL default, `"jaffle_shop"` if user specified) - `dataSource`: set to the enum value from Step 1 (e.g. `"POSTGRES"`) +- `tableReference.catalog`: set to the database name (not `"wren"`) - Each table → one `Model`. Set `tableReference.table` to the exact table name - Each column → one `Column`. Use the exact DB column name - Mark primary key columns with `"isPrimaryKey": true` and set `primaryKey` on the model - For FK columns, add a `Relationship` entry linking the two models - Omit calculated columns for now — they can be added later -### Step 7 — Validate +### Step 6 — Validate +Validate the MDL by running a dry-plan against a simple query. Base64-encode the manifest first: + +```python +import json, base64 +manifest_b64 = base64.b64encode(json.dumps(mdl).encode()).decode() ``` -mdl_validate_manifest(mdl=) + +Then call: + ``` +POST http://localhost:8000/v3/connector//dry-plan +Content-Type: application/json -This calls ibis-server's dry-plan endpoint. If validation fails, fix the reported errors and validate again. +{ + "manifestStr": "", + "sql": "SELECT * FROM LIMIT 1" +} +``` + +If validation succeeds, the response is the planned SQL string. If it fails, fix the reported errors and validate again. + +> **Note:** Use the `/v3/` endpoint, not `/v2/`. The v2 dry-plan requires a separate Wren Engine Java process (`WREN_ENGINE_ENDPOINT`) which is not part of the standard Docker setup. -### Step 8 — Save project (optional) +### Step 7 — Save project (optional) Ask the user if they want to save the MDL as a YAML project directory (useful for version control). 
-If yes, follow the **mdl-project** skill (`skills/mdl-project/SKILL.md`) to write the YAML files. +If yes, follow the **wren-project** skill (`skills/wren-project/SKILL.md`) to write the YAML files and build `target/mdl.json` + `target/connection.json`. -### Step 9 — Deploy +### Step 8 — Deploy + +**If Wren MCP tools are available** (i.e., Claude Code has the `wren` MCP server registered): ``` deploy_manifest(mdl=) ``` +**If MCP tools are not available**, deploy by writing the MDL to the workspace file that the container watches: + +1. Build `target/mdl.json` from the YAML project (see wren-project skill) +2. Ensure the container was started with `-e MDL_PATH=/workspace/target/mdl.json` +3. Restart the container to reload — or call the `deploy` MCP tool after connecting + Confirm success to the user. The MDL is now active and queries can run. --- diff --git a/skills/install.sh b/skills/install.sh index e0aba17d1..c1a2f0d42 100755 --- a/skills/install.sh +++ b/skills/install.sh @@ -13,7 +13,7 @@ set -euo pipefail REPO="Canner/wren-engine" BRANCH="${WREN_SKILLS_BRANCH:-main}" DEST="${CLAUDE_SKILLS_DIR:-$HOME/.claude/skills}" -ALL_SKILLS=(generate-mdl mdl-project wren-sql wren-quickstart) +ALL_SKILLS=(generate-mdl wren-project wren-sql wren-mcp-setup wren-quickstart) # Parse --force flag and skill list from arguments FORCE=false diff --git a/skills/wren-mcp-setup/SKILL.md b/skills/wren-mcp-setup/SKILL.md new file mode 100644 index 000000000..4efe8cfc4 --- /dev/null +++ b/skills/wren-mcp-setup/SKILL.md @@ -0,0 +1,382 @@ +--- +name: wren-mcp-setup +description: Set up Wren Engine MCP server via Docker and register it with an AI agent. Covers pulling the Docker image, running the container with docker run, mounting a workspace, fixing localhost → host.docker.internal for connection info, registering the MCP server in Claude Code (or other MCP clients) using streamable-http transport, and starting a new session to interact with Wren MCP. 
Trigger when a user wants to run Wren MCP in Docker, configure Claude Code MCP, or connect an AI client to a Dockerized Wren Engine. +compatibility: Requires Docker Desktop (or Docker Engine). +metadata: + author: wren-engine + version: "1.1" +--- + +# Set Up Wren MCP via Docker + +Runs the Wren Engine ibis-server + MCP server together in a single Docker container and connects it to Claude Code (or another MCP client) over streamable-http. + +--- + +## Step 1 — Ask for workspace path + +Ask the user: + +> What directory on your host machine should be mounted as the MCP workspace? +> This is where MDL files and YAML project directories will be read and written. +> (Example: `~/wren-workspace`) + +If the user has no preference, suggest creating a dedicated directory such as `~/wren-workspace`: + +```bash +mkdir -p ~/wren-workspace +``` + +Save the answer as `` (use the absolute path, e.g. `/Users/me/wren-workspace`) for use in the next steps. + +--- + +## Step 2 — Prepare workspace and start the container + +The workspace directory is mounted at `/workspace` inside the container. The container auto-loads the MDL and connection info at startup if you provide `MDL_PATH` and `CONNECTION_INFO_FILE` pointing to files inside the workspace. 
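It can help to scaffold the auto-load paths before running the container, so `MDL_PATH` and `CONNECTION_INFO_FILE` point at real locations. A minimal sketch (the `~/wren-workspace` path is the example value from Step 1; substitute the user's choice):

```python
from pathlib import Path

# Workspace chosen in Step 1 (example value; adjust to the user's answer).
workspace = Path.home() / "wren-workspace"
target = workspace / "target"
target.mkdir(parents=True, exist_ok=True)

# The container auto-loads these two files at startup if they exist.
for name in ("mdl.json", "connection.json"):
    f = target / name
    if not f.exists():
        print(f"note: {f} not found yet; the container will start without a loaded MDL")
```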
+ +**Recommended workspace layout:** + +``` +/ +└── target/ + ├── mdl.json # Compiled MDL (from wren-project build) + └── connection.json # Connection info JSON +``` + +Run the following command, substituting `` with the path from Step 1: + +```bash +docker run -d \ + --name wren-mcp \ + -p 8000:8000 \ + -p 9000:9000 \ + -e ENABLE_MCP_SERVER=true \ + -e MCP_TRANSPORT=streamable-http \ + -e MCP_HOST=0.0.0.0 \ + -e MCP_PORT=9000 \ + -e WREN_URL=localhost:8000 \ + -e MDL_PATH=/workspace/target/mdl.json \ + -e CONNECTION_INFO_FILE=/workspace/target/connection.json \ + -v :/workspace \ + ghcr.io/canner/wren-engine-ibis:latest +``` + +Example with a concrete path: + +```bash +docker run -d \ + --name wren-mcp \ + -p 8000:8000 \ + -p 9000:9000 \ + -e ENABLE_MCP_SERVER=true \ + -e MCP_TRANSPORT=streamable-http \ + -e MCP_HOST=0.0.0.0 \ + -e MCP_PORT=9000 \ + -e WREN_URL=localhost:8000 \ + -e MDL_PATH=/workspace/target/mdl.json \ + -e CONNECTION_INFO_FILE=/workspace/target/connection.json \ + -v /Users/me/my-mdl-files:/workspace \ + ghcr.io/canner/wren-engine-ibis:latest +``` + +> If `MDL_PATH` or `CONNECTION_INFO_FILE` are not set (or the files don't exist yet), the container starts without a loaded MDL. You can deploy later using the `deploy` MCP tool. + +This starts the container using the image `ghcr.io/canner/wren-engine-ibis:latest` with: + +| Service | Port | Purpose | +|---------|------|---------| +| ibis-server | 8000 | REST API for query execution and metadata | +| mcp-server (streamable-http) | 9000 | MCP endpoint for AI clients | + +The workspace directory is mounted at `/workspace` inside the container. + +Verify the container is running: +```bash +docker ps --filter name=wren-mcp +docker logs -f wren-mcp +``` + +--- + +## Step 3 — Fix connection info: localhost → host.docker.internal + +**Critical:** The container cannot reach the host's `localhost` directly. 
+ +If the user's database connection info references `localhost` or `127.0.0.1` as the host, it must be changed to `host.docker.internal` so the container can reach a database running on the host machine. + +**Examples:** + +| Original | Inside Docker | +|----------|--------------| +| `"host": "localhost"` | `"host": "host.docker.internal"` | +| `"host": "127.0.0.1"` | `"host": "host.docker.internal"` | +| Cloud/remote host (e.g. `mydb.us-east-1.rds.amazonaws.com`) | No change needed | + +When the user provides connection credentials later (via `setup_connection`), check the `host` field and warn if it is `localhost` or `127.0.0.1`. + +--- + +## Step 4 — Configure Claude Code MCP + +Claude Code uses **streamable-http** transport to connect to the containerized MCP server. + +Add to `~/.claude/settings.json` under `mcpServers`: + +```json +{ + "mcpServers": { + "wren": { + "type": "http", + "url": "http://localhost:9000/mcp" + } + } +} +``` + +**Via Claude Code CLI (recommended):** + +```bash +claude mcp add --transport http wren http://localhost:9000/mcp +``` + +After adding, **restart Claude Code** for the new MCP server to be loaded into the session. Confirm with: +```bash +claude mcp list +``` + +> **Note:** The config file is `~/.claude/settings.json`, not `~/.claude.json`. Adding the MCP server while a session is already running has no effect until the session is restarted. + +--- + +## Step 5 — Configure other MCP clients + +### Cline / Cursor / VS Code MCP Extension + +These clients also support HTTP transport. Add to their MCP settings: + +```json +{ + "mcpServers": { + "wren": { + "type": "streamable-http", + "url": "http://localhost:9000/mcp" + } + } +} +``` + +### Claude Desktop (stdio fallback) + +Claude Desktop does not support HTTP transport natively. Use a local stdio proxy or run the MCP server locally instead. 
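Before asking the agent to verify the connection in the next step, you can confirm from the host that both container ports are accepting connections. A small stdlib-only sketch (no Wren-specific API assumed; ports match the default mapping from Step 2):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in [("ibis-server", 8000), ("mcp-server", 9000)]:
    state = "reachable" if port_open("localhost", port) else "NOT reachable"
    print(f"{name:11s} :{port} {state}")
```

If either port reports NOT reachable, check `docker ps` and the Troubleshooting section below before configuring any MCP client.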
+ +--- + +## Step 6 — Verify the connection + +Ask the AI agent to run a health check: + +``` +Use the health_check() tool to verify Wren Engine is reachable. +``` + +Expected response: `SELECT 1` returns a successful result. + +If health check fails, see **Troubleshooting** below. + +--- + +## Troubleshooting — MCP server not healthy + +### 1. Check container status and logs + +```bash +docker ps --filter name=wren-mcp +docker logs wren-mcp +``` + +Look for startup errors or crash loops. If the container exited, `logs` will show the last output before it stopped. + +### 2. Port already in use + +The container exposes ports **8000** (ibis-server) and **9000** (MCP). If either port is already bound by another process on the host, the `docker run` command will fail with a bind error. + +**Check what is using the port:** + +```bash +# macOS / Linux +lsof -i :9000 +lsof -i :8000 + +# or with ss (Linux) +ss -tlnp | grep -E '8000|9000' +``` + +If another process is occupying the port you have two options: + +**Option A — stop the conflicting process:** +```bash +# macOS: kill by port +kill $(lsof -ti :9000) +``` + +**Option B — remap to different host ports:** + +Stop and remove the existing container first, then re-run with different host-side ports: +```bash +docker rm -f wren-mcp +docker run -d \ + --name wren-mcp \ + -p 18000:8000 \ + -p 19000:9000 \ + -e ENABLE_MCP_SERVER=true \ + -e MCP_TRANSPORT=streamable-http \ + -e MCP_HOST=0.0.0.0 \ + -e MCP_PORT=9000 \ + -e WREN_URL=localhost:8000 \ + -v :/workspace \ + ghcr.io/canner/wren-engine-ibis:latest +``` + +Then update the MCP client URL to match: +```bash +claude mcp add --transport http wren http://localhost:19000/mcp +``` + +### 3. 
Container started but MCP endpoint returns an error + +- Confirm the container was started with `-e ENABLE_MCP_SERVER=true`, `-e MCP_TRANSPORT=streamable-http`, and `-e MCP_HOST=0.0.0.0` +- Try curling the endpoint directly: + ```bash + curl -v http://localhost:9000/mcp + ``` + A `405 Method Not Allowed` response means the endpoint is reachable but expects a POST — that is normal and indicates the MCP server is up. + +### 4. Database connection refused inside the container + +If `health_check()` passes but queries fail with a connection error, the database host is likely still set to `localhost`. See **Step 3** above — change it to `host.docker.internal`. + +--- + +## Step 7 — Load an MDL and start querying + +**Option A — Auto-load at startup (recommended):** + +Place your compiled `mdl.json` in `/target/mdl.json` and `connection.json` in `/target/connection.json` before starting the container. The container reads `MDL_PATH` and `CONNECTION_INFO_FILE` at startup and logs: + +``` +Loaded MDL /workspace/target/mdl.json (9 models, 47 columns) +Loaded connection info /workspace/target/connection.json +``` + +**Option B — Deploy via MCP tool (after container is running):** + +In the AI client (after MCP is connected): + +``` +deploy(mdl_file_path="/workspace/target/mdl.json") +``` + +Or deploy from an in-memory dict: + +``` +deploy_manifest(mdl=) +``` + +**To generate an MDL from scratch**, use the `generate-mdl` skill. It will introspect your database schema via ibis-server and build the manifest for you. + +--- + +## Reference: connection info by data source + +When calling `setup_connection`, use these formats. **Replace `localhost` with `host.docker.internal`** if the database runs on your host machine. 
+ +``` +POSTGRES : {"host": "host.docker.internal", "port": "5432", "user": "...", "password": "...", "database": "..."} +MYSQL : {"host": "host.docker.internal", "port": "3306", "user": "...", "password": "...", "database": "..."} +MSSQL : {"host": "host.docker.internal", "port": "1433", "user": "...", "password": "...", "database": "..."} +CLICKHOUSE : {"host": "host.docker.internal", "port": "8123", "user": "...", "password": "...", "database": "..."} +TRINO : {"host": "host.docker.internal", "port": "8080", "user": "...", "catalog": "...", "schema": "..."} +DUCKDB : {"path": "/workspace/.duckdb"} ← file must be inside workspace +BIGQUERY : {"project": "...", "dataset": "...", "credentials_base64": "..."} +SNOWFLAKE : {"account": "...", "user": "...", "password": "...", "database": "...", "schema": "..."} +``` + +For DuckDB: the database file must be placed in the mounted workspace directory so the container can access it at `/workspace/.duckdb`. + +--- + +## Step 8 — Register the MCP server with your AI agent + +Once the container is running and healthy (Steps 1–7), register the Wren MCP endpoint with your AI agent so it can call Wren tools in conversation. + +### Claude Code (primary example) + +**Option A — CLI (recommended):** + +```bash +claude mcp add --transport http wren http://localhost:9000/mcp +``` + +This registers the server under the name `wren` for the current project. To make it available globally (all projects): + +```bash +claude mcp add --transport http --scope user wren http://localhost:9000/mcp +``` + +**Option B — Edit `~/.claude/settings.json` directly:** + +```json +{ + "mcpServers": { + "wren": { + "type": "http", + "url": "http://localhost:9000/mcp" + } + } +} +``` + +**Verify the registration:** + +```bash +claude mcp list +``` + +You should see `wren` listed with status `connected` or `http`. + +> **Important:** Adding the MCP server while a session is already open has no effect. 
You must start a **new session** for Claude Code to load the server. + +### Other MCP clients + +| Client | Config key | Transport value | +|--------|-----------|-----------------| +| Cline / Cursor / VS Code MCP Extension | `mcpServers.wren` | `"type": "streamable-http"` | +| Claude Desktop | Not supported natively | Use a local stdio proxy instead | + +For Cline/Cursor, add to their MCP settings JSON: + +```json +{ + "mcpServers": { + "wren": { + "type": "streamable-http", + "url": "http://localhost:9000/mcp" + } + } +} +``` + +--- + +## You're ready! + +Start a **new Claude Code session** (or restart your MCP client). The Wren tools (`health_check`, `query`, `deploy`, `setup_connection`, etc.) are now available. + +Try it: + +``` +Use health_check() to verify the connection. +``` + +Once confirmed, you can start querying your data through Wren Engine — use `/generate-mdl` to scaffold an MDL from your database, or `/wren-sql` for help writing queries. diff --git a/skills/mdl-project/SKILL.md b/skills/wren-project/SKILL.md similarity index 99% rename from skills/mdl-project/SKILL.md rename to skills/wren-project/SKILL.md index 2239e2e37..2ea9471be 100644 --- a/skills/mdl-project/SKILL.md +++ b/skills/wren-project/SKILL.md @@ -1,5 +1,5 @@ --- -name: mdl-project +name: wren-project description: Save, load, and build Wren MDL manifests as YAML project directories for version control. Use when a user wants to persist an MDL as human-readable YAML files, load a YAML project back into MDL JSON, or compile a YAML project to a deployable mdl.json file. Also manages connection info stored in connection.yml and compiled to target/connection.json. metadata: author: wren-engine diff --git a/skills/wren-quickstart/SKILL.md b/skills/wren-quickstart/SKILL.md index 8b3460cf6..27e34f13f 100644 --- a/skills/wren-quickstart/SKILL.md +++ b/skills/wren-quickstart/SKILL.md @@ -1,269 +1,184 @@ --- name: wren-quickstart -description: Set up Wren Engine MCP server via Docker. 
Covers pulling the Docker image, running the container with docker run, mounting a workspace, fixing localhost → host.docker.internal for connection info, and registering the MCP server in Claude Code (or other MCP clients) using streamable-http transport. Trigger when a user wants to run Wren MCP in Docker, configure Claude Code MCP, or connect an AI client to a Dockerized Wren Engine. -compatibility: Requires Docker Desktop (or Docker Engine). +description: End-to-end quickstart for Wren Engine — from zero to querying. Guides the user through installing skills, creating a workspace, generating an MDL from a live database, saving it as a versioned project, starting the Wren MCP Docker container, and verifying the setup with a health check. Trigger when a user wants to set up Wren Engine from scratch, onboard a new data source, or get started with Wren MCP. +compatibility: Requires Docker Desktop (or Docker Engine). No local database drivers needed. metadata: author: wren-engine version: "1.0" --- -# Set Up Wren MCP via Docker +# Wren Quickstart -Runs the Wren Engine ibis-server + MCP server together in a single Docker container and connects it to Claude Code (or another MCP client) over streamable-http. +This skill walks a user through setting up Wren Engine end-to-end — from installing the required skills to running their first query via MCP. Each phase delegates to a focused skill. Follow the steps in order. --- -## Step 1 — Ask for workspace path +## Phase 1 — Install skills -Ask the user: +Before the workflow can proceed, the user needs the dependent skills installed locally. -> What directory on your host machine should be mounted as the MCP workspace? -> This is where MDL files and YAML project directories will be read and written. 
-> (Example: `~/wren-workspace`) - -If the user has no preference, suggest creating a dedicated directory such as `~/wren-workspace`: +Tell the user to run the install script once: ```bash -mkdir -p ~/wren-workspace +# From a local clone: +bash skills/install.sh + +# Or remotely (no clone required): +curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash ``` -Save the answer as `` (use the absolute path, e.g. `/Users/me/wren-workspace`) for use in the next steps. +This installs all Wren skills (`generate-mdl`, `wren-project`, `wren-mcp-setup`, `wren-quickstart`) into `~/.claude/skills/`. ---- +After installation, the user should **restart their AI client session** so the new skills are loaded. -## Step 2 — Start the container +> If the user only wants specific skills, they can pass names as arguments: +> ```bash +> bash skills/install.sh generate-mdl wren-project wren-mcp-setup +> ``` -Run the following command, substituting `` with the path from Step 1: +--- -```bash -docker run -d \ - --name wren-mcp \ - -p 8000:8000 \ - -p 9000:9000 \ - -e ENABLE_MCP_SERVER=true \ - -e MCP_TRANSPORT=streamable-http \ - -e MCP_HOST=0.0.0.0 \ - -e MCP_PORT=9000 \ - -e WREN_URL=localhost:8000 \ - -v :/workspace \ - ghcr.io/canner/wren-engine-ibis:latest -``` +## Phase 2 — Create a workspace -Example with a concrete path: +Create a dedicated workspace directory on the host machine. This directory will be mounted into the Docker container, so the container can read and write MDL files. 
+ +Ask the user where they want the workspace, or suggest a default: ```bash -docker run -d \ - --name wren-mcp \ - -p 8000:8000 \ - -p 9000:9000 \ - -e ENABLE_MCP_SERVER=true \ - -e MCP_TRANSPORT=streamable-http \ - -e MCP_HOST=0.0.0.0 \ - -e MCP_PORT=9000 \ - -e WREN_URL=localhost:8000 \ - -v /Users/me/my-mdl-files:/workspace \ - ghcr.io/canner/wren-engine-ibis:latest +mkdir -p ~/wren-workspace ``` -This starts the container using the image `ghcr.io/canner/wren-engine-ibis:latest` with: - -| Service | Port | Purpose | -|---------|------|---------| -| ibis-server | 8000 | REST API for query execution and metadata | -| mcp-server (streamable-http) | 9000 | MCP endpoint for AI clients | +Save the chosen path as `` (absolute path, e.g. `/Users/me/wren-workspace`). All subsequent steps reference this path. -The workspace directory is mounted at `/workspace` inside the container. +Recommended workspace layout after the quickstart completes: -Verify the container is running: -```bash -docker ps --filter name=wren-mcp -docker logs -f wren-mcp +``` +/ +├── wren_project.yml +├── models/ +│ └── *.yml +├── relationships.yml +├── views.yml +├── connection.yml +└── target/ + ├── mdl.json # Compiled MDL — loaded by Docker container + └── connection.json # Connection info — loaded by Docker container ``` --- -## Step 3 — Fix connection info: localhost → host.docker.internal - -**Critical:** The container cannot reach the host's `localhost` directly. +## Phase 3 — Generate MDL and save project -If the user's database connection info references `localhost` or `127.0.0.1` as the host, it must be changed to `host.docker.internal` so the container can reach a database running on the host machine. 
+### 3a — Generate MDL -**Examples:** +Invoke the **generate-mdl** skill to introspect the user's database and build the MDL manifest: -| Original | Inside Docker | -|----------|--------------| -| `"host": "localhost"` | `"host": "host.docker.internal"` | -| `"host": "127.0.0.1"` | `"host": "host.docker.internal"` | -| Cloud/remote host (e.g. `mydb.us-east-1.rds.amazonaws.com`) | No change needed | - -When the user provides connection credentials later (via `setup_connection`), check the `host` field and warn if it is `localhost` or `127.0.0.1`. +``` +@generate-mdl +``` ---- +The generate-mdl skill will: +1. Ask for data source type and connection credentials +2. Call ibis-server to fetch table schema and foreign key constraints +3. Build the MDL JSON (models, columns, relationships) +4. Validate the manifest with a dry-plan -## Step 4 — Configure Claude Code MCP +> **Important:** At this stage ibis-server may not be running yet. If the user has not started a container, proceed to Phase 4 first (start the container), then come back to generate the MDL using the running ibis-server on port 8000. +> +> Alternatively, if the user already has a running ibis-server, run Phase 3 before Phase 4. -Claude Code uses **streamable-http** transport to connect to the containerized MCP server. +### 3b — Save as YAML project -Add to `~/.claude.json` under `mcpServers` (or run `claude mcp add`): +After the MDL is generated, invoke the **wren-project** skill to save it as a versioned YAML project inside the workspace: -```json -{ - "mcpServers": { - "wren": { - "type": "http", - "url": "http://localhost:9000/mcp" - } - } -} ``` - -**Via Claude Code CLI (recommended):** - -```bash -claude mcp add --transport http wren http://localhost:9000/mcp +@wren-project ``` -After adding, restart Claude Code or reload the MCP server list. 
Confirm with:
-```bash
-claude mcp list
-```
-
----
-
-## Step 5 — Configure other MCP clients
+Direct the skill to write the project files into `<WORKSPACE_DIR>`:
 
-### Cline / Cursor / VS Code MCP Extension
+- `<WORKSPACE_DIR>/wren_project.yml`
+- `<WORKSPACE_DIR>/models/*.yml`
+- `<WORKSPACE_DIR>/relationships.yml`
+- `<WORKSPACE_DIR>/views.yml`
+- `<WORKSPACE_DIR>/connection.yml`
 
-These clients also support HTTP transport. Add to their MCP settings:
+Then build the compiled targets:
 
-```json
-{
-  "mcpServers": {
-    "wren": {
-      "type": "streamable-http",
-      "url": "http://localhost:9000/mcp"
-    }
-  }
-}
-```
+- `<WORKSPACE_DIR>/target/mdl.json`
+- `<WORKSPACE_DIR>/target/connection.json`
 
-### Claude Desktop (stdio fallback)
-
-Claude Desktop does not support HTTP transport natively. Use a local stdio proxy or run the MCP server locally instead.
+The Docker container will auto-load these files at startup.
 
 ---
 
-## Step 6 — Verify the connection
+## Phase 4 — Start and register the MCP server
 
-Ask the AI agent to run a health check:
+Invoke the **wren-mcp-setup** skill to start the Docker container and register the MCP server with the AI client:
 
 ```
-Use the health_check() tool to verify Wren Engine is reachable.
+@wren-mcp-setup
 ```
 
-Expected response: `SELECT 1` returns a successful result.
+Pass `<WORKSPACE_DIR>` as the workspace mount path when the skill asks.
+
+The wren-mcp-setup skill will:
+1. Start the container with `-v <WORKSPACE_DIR>:/workspace`
+2. Set `MDL_PATH=/workspace/target/mdl.json` and `CONNECTION_INFO_FILE=/workspace/target/connection.json`
+3. Register the MCP server with the AI client (`claude mcp add`)
+4. Verify the container is running
 
-If health check fails, see **Troubleshooting** below.
+> If the MDL files already exist in `<WORKSPACE_DIR>/target/` before the container starts, they are loaded automatically at boot. No separate `deploy` call is needed.
 
 ---
 
-## Troubleshooting — MCP server not healthy
+## Phase 5 — Verify and confirm
 
-### 1. 
Check container status and logs +Once the MCP server is registered, ask the AI agent to run a health check in the current session: -```bash -docker ps --filter name=wren-mcp -docker logs wren-mcp ``` - -Look for startup errors or crash loops. If the container exited, `logs` will show the last output before it stopped. - -### 2. Port already in use - -The container exposes ports **8000** (ibis-server) and **9000** (MCP). If either port is already bound by another process on the host, the `docker run` command will fail with a bind error. - -**Check what is using the port:** - -```bash -# macOS / Linux -lsof -i :9000 -lsof -i :8000 - -# or with ss (Linux) -ss -tlnp | grep -E '8000|9000' +Use health_check() to verify Wren Engine is reachable. ``` -If another process is occupying the port you have two options: +Expected response: `SELECT 1` returns successfully. -**Option A — stop the conflicting process:** -```bash -# macOS: kill by port -kill $(lsof -ti :9000) -``` +If the health check passes: -**Option B — remap to different host ports:** +- Tell the user setup is complete. +- Remind them to **start a new session** so the Wren MCP tools are fully loaded. +- In the new session, they can start querying immediately: -Stop and remove the existing container first, then re-run with different host-side ports: -```bash -docker rm -f wren-mcp -docker run -d \ - --name wren-mcp \ - -p 18000:8000 \ - -p 19000:9000 \ - -e ENABLE_MCP_SERVER=true \ - -e MCP_TRANSPORT=streamable-http \ - -e MCP_HOST=0.0.0.0 \ - -e MCP_PORT=9000 \ - -e WREN_URL=localhost:8000 \ - -v :/workspace \ - ghcr.io/canner/wren-engine-ibis:latest ``` - -Then update the MCP client URL to match: -```bash -claude mcp add --transport http wren http://localhost:19000/mcp +Query: How many orders are in the orders table? ``` -### 3. 
Container started but MCP endpoint returns an error
-
-- Confirm the container was started with `-e ENABLE_MCP_SERVER=true`, `-e MCP_TRANSPORT=streamable-http`, and `-e MCP_HOST=0.0.0.0`
-- Try curling the endpoint directly:
-  ```bash
-  curl -v http://localhost:9000/mcp
-  ```
-  A `405 Method Not Allowed` response means the endpoint is reachable but expects a POST — that is normal and indicates the MCP server is up.
-
-### 4. Database connection refused inside the container
-
-If `health_check()` passes but queries fail with a connection error, the database host is likely still set to `localhost`. See **Step 3** above — change it to `host.docker.internal`.
 
 ---
 
-## Step 7 — Load an MDL and start querying
-
-Place your MDL JSON file in the workspace directory, then in the AI client:
+## Quick reference — skill invocations
 
-```
-deploy(mdl_file_path="/workspace/my_mdl.json")
-```
-
-Or generate a new MDL from scratch using the `generate-mdl` skill.
+| Phase | Skill | Purpose |
+|-------|-------|---------|
+| 3a | `@generate-mdl` | Introspect database and build MDL JSON |
+| 3b | `@wren-project` | Save MDL as YAML project + compile to `target/` |
+| 4 | `@wren-mcp-setup` | Start Docker container and register MCP server |
 
 ---
 
-## Reference: connection info by data source
+## Troubleshooting
 
-When calling `setup_connection`, use these formats. **Replace `localhost` with `host.docker.internal`** if the database runs on your host machine.
+**Container not finding MDL at startup:**
+- Confirm `<WORKSPACE_DIR>/target/mdl.json` exists before starting the container.
+- Check container logs: `docker logs wren-mcp` -``` -POSTGRES : {"host": "host.docker.internal", "port": "5432", "user": "...", "password": "...", "database": "..."} -MYSQL : {"host": "host.docker.internal", "port": "3306", "user": "...", "password": "...", "database": "..."} -MSSQL : {"host": "host.docker.internal", "port": "1433", "user": "...", "password": "...", "database": "..."} -CLICKHOUSE : {"host": "host.docker.internal", "port": "8123", "user": "...", "password": "...", "database": "..."} -TRINO : {"host": "host.docker.internal", "port": "8080", "user": "...", "catalog": "...", "schema": "..."} -DUCKDB : {"path": "/workspace/.duckdb"} ← file must be inside workspace -BIGQUERY : {"project": "...", "dataset": "...", "credentials_base64": "..."} -SNOWFLAKE : {"account": "...", "user": "...", "password": "...", "database": "...", "schema": "..."} -``` +**generate-mdl fails because ibis-server is not yet running:** +- Start the container first (Phase 4), then return to Phase 3. +- ibis-server is available at `http://localhost:8000` once the container is up. + +**MCP tools not available after registration:** +- The MCP server is only loaded at session start. Start a new Claude Code session after registering. -For DuckDB: the database file must be placed in the mounted workspace directory so the container can access it at `/workspace/.duckdb`. +**Database connection refused inside Docker:** +- Change `localhost` / `127.0.0.1` to `host.docker.internal` in connection credentials. +- See the **wren-mcp-setup** skill for the full localhost fix. 
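The localhost fix named in the troubleshooting list above can be scripted on the host before starting the container. A minimal sketch, assuming the flat `{"host": "..."}` connection shape used throughout these skills; `fix_host` is an illustrative helper, not part of any skill:

```shell
#!/bin/sh
# Hedged sketch: rewrite localhost / 127.0.0.1 to host.docker.internal in a
# connection file, so the Docker container can reach a database on the host.
# fix_host is illustrative only; it assumes the flat {"host": "..."} shape.
fix_host() {
  sed -i.bak -E 's/"host": *"(localhost|127\.0\.0\.1)"/"host": "host.docker.internal"/' "$1"
}

printf '%s\n' '{"host": "localhost", "port": "5432", "user": "wren", "password": "secret", "database": "demo"}' > /tmp/connection.json
fix_host /tmp/connection.json
cat /tmp/connection.json   # "host" is now "host.docker.internal"
```

Remote hosts (e.g. an RDS endpoint) pass through unchanged, matching the rewrite table in the wren-mcp-setup skill.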
From d135d15df367f0b3d982d8973f21c67d6bd8d110 Mon Sep 17 00:00:00 2001 From: Jax Liu Date: Fri, 6 Mar 2026 14:50:35 +0800 Subject: [PATCH 2/5] update readme --- mcp-server/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/mcp-server/README.md b/mcp-server/README.md index e1b284164..db688ec3c 100644 --- a/mcp-server/README.md +++ b/mcp-server/README.md @@ -80,7 +80,7 @@ deploy(mdl_file_path="/workspace/my_mdl.json") ``` **Output — save a generated MDL:** -The agent writes YAML project files and compiled MDL JSON back to your host via the `mdl-project` skill: +The agent writes YAML project files and compiled MDL JSON back to your host via the `wren-project` skill: ``` # agent writes: mcp-server/workspace/my_project/wren_project.yml, models/*.yml, ... # agent compiles to: mcp-server/workspace/my_project/target/mdl.json From cc90d02bf6cb88057a0a753b6be4f668255bbbd6 Mon Sep 17 00:00:00 2001 From: Jax Liu Date: Fri, 6 Mar 2026 15:05:16 +0800 Subject: [PATCH 3/5] feat(skills): add auto-update notification mechanism Each SKILL.md now checks versions.json on invocation and notifies the user if a newer version is available, so updates are surfaced at the natural moment of use rather than requiring manual checking. 
- Add skills/versions.json as the lightweight version registry - Add ## Version check section to all 5 SKILL.md files - Add update reminder to install.sh post-install output - Document update and release workflow in skills/README.md Co-Authored-By: Claude Sonnet 4.6 --- skills/README.md | 34 +++++++++++++++++++++++++++++++++ skills/generate-mdl/SKILL.md | 16 ++++++++++++++++ skills/install.sh | 4 ++++ skills/versions.json | 7 +++++++ skills/wren-mcp-setup/SKILL.md | 16 ++++++++++++++++ skills/wren-project/SKILL.md | 16 ++++++++++++++++ skills/wren-quickstart/SKILL.md | 16 ++++++++++++++++ skills/wren-sql/SKILL.md | 16 ++++++++++++++++ 8 files changed, 125 insertions(+) create mode 100644 skills/versions.json diff --git a/skills/README.md b/skills/README.md index 2763a2c68..7680b6e5c 100644 --- a/skills/README.md +++ b/skills/README.md @@ -57,6 +57,40 @@ Once installed, invoke a skill by name in your conversation: See [SKILLS.md](SKILLS.md) for full details on each skill. +## Updating Skills + +Each skill automatically checks for updates when invoked. If a newer version is available, the AI agent will notify you with the update command before continuing. + +To update manually at any time: + +```bash +# Update all skills +curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force + +# Update a specific skill +curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force generate-mdl +``` + +## Releasing a New Skill Version + +When updating a skill, two files must be kept in sync: + +1. Update `version` in the skill's `SKILL.md` frontmatter: + ```yaml + metadata: + author: wren-engine + version: "1.2" # bump this + ``` + +2. Update the matching entry in [`versions.json`](versions.json): + ```json + { + "generate-mdl": "1.2" + } + ``` + +Both files must have the same version number. 
The `SKILL.md` version is what users have installed locally; `versions.json` is what the update check compares against. + ## Requirements - A running [ibis-server](../ibis-server/) instance diff --git a/skills/generate-mdl/SKILL.md b/skills/generate-mdl/SKILL.md index 0092c0a8f..38b109e03 100644 --- a/skills/generate-mdl/SKILL.md +++ b/skills/generate-mdl/SKILL.md @@ -9,6 +9,22 @@ metadata: # Generate Wren MDL +## Version check + +Silently fetch `https://raw.githubusercontent.com/Canner/wren-engine/main/skills/versions.json`. +Compare the `generate-mdl` key with this skill's version (`1.1`). +If the remote version is newer, notify the user before proceeding: + +> A newer version of the **generate-mdl** skill is available (remote: X.Y, installed: 1.1). +> Update with: +> ```bash +> curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force generate-mdl +> ``` + +Then continue with the workflow below regardless of update status. + +--- + Generates a Wren MDL manifest by using ibis-server to introspect the database schema — no local database drivers required. All schema discovery goes through ibis-server, which already has drivers for all supported data sources. ## Workflow diff --git a/skills/install.sh b/skills/install.sh index c1a2f0d42..59b33db4a 100755 --- a/skills/install.sh +++ b/skills/install.sh @@ -113,3 +113,7 @@ echo "Done. Invoke skills in your AI client:" for skill in "${SELECTED_SKILLS[@]}"; do echo " /$skill" done +echo "" +echo "To update skills later, re-run with --force:" +echo " curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force" +echo "Or check for updates: each skill notifies you automatically when a newer version is available." 
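The notify-if-newer decision behind the update check can be sketched in shell; `newer_available` is a hypothetical helper (not shipped in install.sh), assuming two-part versions like those in `versions.json`:

```shell
#!/bin/sh
# Hedged sketch of the version comparison behind the update notification.
# newer_available is a hypothetical helper, not part of install.sh.
newer_available() {
  installed="$1"; remote="$2"
  # Newer iff the versions differ and the remote sorts last numerically.
  [ "$installed" != "$remote" ] &&
    [ "$(printf '%s\n%s\n' "$installed" "$remote" | sort -t. -k1,1n -k2,2n | tail -n1)" = "$remote" ]
}

newer_available "1.1" "1.2" && echo "update available"   # prints: update available
newer_available "1.1" "1.1" || echo "up to date"         # prints: up to date
```

Numeric field sorting (rather than plain string comparison) keeps e.g. `1.10` ordered after `1.2`.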
diff --git a/skills/versions.json b/skills/versions.json new file mode 100644 index 000000000..6258b9ac6 --- /dev/null +++ b/skills/versions.json @@ -0,0 +1,7 @@ +{ + "generate-mdl": "1.1", + "wren-project": "1.1", + "wren-sql": "1.0", + "wren-mcp-setup": "1.1", + "wren-quickstart": "1.0" +} diff --git a/skills/wren-mcp-setup/SKILL.md b/skills/wren-mcp-setup/SKILL.md index 4efe8cfc4..2ed0c48c6 100644 --- a/skills/wren-mcp-setup/SKILL.md +++ b/skills/wren-mcp-setup/SKILL.md @@ -9,6 +9,22 @@ metadata: # Set Up Wren MCP via Docker +## Version check + +Silently fetch `https://raw.githubusercontent.com/Canner/wren-engine/main/skills/versions.json`. +Compare the `wren-mcp-setup` key with this skill's version (`1.1`). +If the remote version is newer, notify the user before proceeding: + +> A newer version of the **wren-mcp-setup** skill is available (remote: X.Y, installed: 1.1). +> Update with: +> ```bash +> curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force wren-mcp-setup +> ``` + +Then continue with the workflow below regardless of update status. + +--- + Runs the Wren Engine ibis-server + MCP server together in a single Docker container and connects it to Claude Code (or another MCP client) over streamable-http. --- diff --git a/skills/wren-project/SKILL.md b/skills/wren-project/SKILL.md index 2ea9471be..177ca1b13 100644 --- a/skills/wren-project/SKILL.md +++ b/skills/wren-project/SKILL.md @@ -8,6 +8,22 @@ metadata: # MDL Project +## Version check + +Silently fetch `https://raw.githubusercontent.com/Canner/wren-engine/main/skills/versions.json`. +Compare the `wren-project` key with this skill's version (`1.1`). +If the remote version is newer, notify the user before proceeding: + +> A newer version of the **wren-project** skill is available (remote: X.Y, installed: 1.1). 
+> Update with: +> ```bash +> curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force wren-project +> ``` + +Then continue with the workflow below regardless of update status. + +--- + A Wren MDL project is a directory of YAML files — one file per model — that makes MDL manifests human-readable and version-control friendly (similar to dbt projects). YAML files use **snake_case** field names for readability. The compiled `target/mdl.json` uses **camelCase** (the wire format expected by ibis-server). The conversion is documented in [Field mapping](#field-mapping). diff --git a/skills/wren-quickstart/SKILL.md b/skills/wren-quickstart/SKILL.md index 27e34f13f..fe0a1b27b 100644 --- a/skills/wren-quickstart/SKILL.md +++ b/skills/wren-quickstart/SKILL.md @@ -9,6 +9,22 @@ metadata: # Wren Quickstart +## Version check + +Silently fetch `https://raw.githubusercontent.com/Canner/wren-engine/main/skills/versions.json`. +Compare the `wren-quickstart` key with this skill's version (`1.0`). +If the remote version is newer, notify the user before proceeding: + +> A newer version of the **wren-quickstart** skill is available (remote: X.Y, installed: 1.0). +> Update with: +> ```bash +> curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force wren-quickstart +> ``` + +Then continue with the workflow below regardless of update status. + +--- + This skill walks a user through setting up Wren Engine end-to-end — from installing the required skills to running their first query via MCP. Each phase delegates to a focused skill. Follow the steps in order. --- diff --git a/skills/wren-sql/SKILL.md b/skills/wren-sql/SKILL.md index 891864637..091ac9c50 100644 --- a/skills/wren-sql/SKILL.md +++ b/skills/wren-sql/SKILL.md @@ -9,6 +9,22 @@ metadata: # Wren SQL +## Version check + +Silently fetch `https://raw.githubusercontent.com/Canner/wren-engine/main/skills/versions.json`. 
+Compare the `wren-sql` key with this skill's version (`1.0`). +If the remote version is newer, notify the user before proceeding: + +> A newer version of the **wren-sql** skill is available (remote: X.Y, installed: 1.0). +> Update with: +> ```bash +> curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash -s -- --force wren-sql +> ``` + +Then continue with the workflow below regardless of update status. + +--- + Wren Engine translates SQL through a semantic layer (MDL — Model Definition Language) before executing it against a backend database. SQL must target MDL model names, not raw database tables. For specific topics, load the relevant reference file: From 469d888724eda3f206fa5c5ce9582e2b43a0f053 Mon Sep 17 00:00:00 2001 From: Jax Liu Date: Fri, 6 Mar 2026 15:09:26 +0800 Subject: [PATCH 4/5] fix(skills): address CodeRabbit review issues MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - Remove duplicate Step 8 from wren-mcp-setup (MCP registration already covered in Steps 4/5; placing it after Steps 6/7 that require MCP tools was logically incorrect) - Fix wren-quickstart Phase 5 to require a new session before running health_check() — MCP tools only load on session start - Add missing wren-sql to installed skills list in wren-quickstart - Add /wren-mcp-setup and /wren-quickstart to SKILLS.md invocation examples - Update stale docker-compose workflow summary in SKILLS.md to reflect the current docker run approach Co-Authored-By: Claude Sonnet 4.6 --- skills/SKILLS.md | 10 +++--- skills/wren-mcp-setup/SKILL.md | 63 --------------------------------- skills/wren-quickstart/SKILL.md | 9 ++--- 3 files changed, 11 insertions(+), 71 deletions(-) diff --git a/skills/SKILLS.md b/skills/SKILLS.md index e7bf451ba..bf7a1b2ad 100644 --- a/skills/SKILLS.md +++ b/skills/SKILLS.md @@ -159,10 +159,10 @@ Sets up Wren Engine MCP server via Docker, registers it with an AI agent (Claude ### Workflow 
summary 1. Ask user for workspace mount path -2. Create `docker/.env` with `MDL_WORKSPACE` -3. `docker compose up -d` -4. Rewrite `localhost` → `host.docker.internal` in connection credentials -5. Add `wren` MCP server to Claude Code using streamable-http on port 9000 +2. `docker run` with workspace mounted at `/workspace`, MCP server enabled on port 9000 +3. Rewrite `localhost` → `host.docker.internal` in connection credentials +4. Add `wren` MCP server to Claude Code using streamable-http on port 9000 (`claude mcp add`) +5. Start a new session so the MCP tools are loaded 6. Run `health_check()` to verify --- @@ -183,4 +183,6 @@ Then invoke in your AI client: /generate-mdl /wren-project /wren-sql +/wren-mcp-setup +/wren-quickstart ``` diff --git a/skills/wren-mcp-setup/SKILL.md b/skills/wren-mcp-setup/SKILL.md index 2ed0c48c6..cc4740e0f 100644 --- a/skills/wren-mcp-setup/SKILL.md +++ b/skills/wren-mcp-setup/SKILL.md @@ -322,69 +322,6 @@ For DuckDB: the database file must be placed in the mounted workspace directory --- -## Step 8 — Register the MCP server with your AI agent - -Once the container is running and healthy (Steps 1–7), register the Wren MCP endpoint with your AI agent so it can call Wren tools in conversation. - -### Claude Code (primary example) - -**Option A — CLI (recommended):** - -```bash -claude mcp add --transport http wren http://localhost:9000/mcp -``` - -This registers the server under the name `wren` for the current project. To make it available globally (all projects): - -```bash -claude mcp add --transport http --scope user wren http://localhost:9000/mcp -``` - -**Option B — Edit `~/.claude/settings.json` directly:** - -```json -{ - "mcpServers": { - "wren": { - "type": "http", - "url": "http://localhost:9000/mcp" - } - } -} -``` - -**Verify the registration:** - -```bash -claude mcp list -``` - -You should see `wren` listed with status `connected` or `http`. 
- -> **Important:** Adding the MCP server while a session is already open has no effect. You must start a **new session** for Claude Code to load the server. - -### Other MCP clients - -| Client | Config key | Transport value | -|--------|-----------|-----------------| -| Cline / Cursor / VS Code MCP Extension | `mcpServers.wren` | `"type": "streamable-http"` | -| Claude Desktop | Not supported natively | Use a local stdio proxy instead | - -For Cline/Cursor, add to their MCP settings JSON: - -```json -{ - "mcpServers": { - "wren": { - "type": "streamable-http", - "url": "http://localhost:9000/mcp" - } - } -} -``` - ---- - ## You're ready! Start a **new Claude Code session** (or restart your MCP client). The Wren tools (`health_check`, `query`, `deploy`, `setup_connection`, etc.) are now available. diff --git a/skills/wren-quickstart/SKILL.md b/skills/wren-quickstart/SKILL.md index fe0a1b27b..aeea880dc 100644 --- a/skills/wren-quickstart/SKILL.md +++ b/skills/wren-quickstart/SKILL.md @@ -43,7 +43,7 @@ bash skills/install.sh curl -fsSL https://raw.githubusercontent.com/Canner/wren-engine/main/skills/install.sh | bash ``` -This installs all Wren skills (`generate-mdl`, `wren-project`, `wren-mcp-setup`, `wren-quickstart`) into `~/.claude/skills/`. +This installs all Wren skills (`generate-mdl`, `wren-project`, `wren-sql`, `wren-mcp-setup`, `wren-quickstart`) into `~/.claude/skills/`. After installation, the user should **restart their AI client session** so the new skills are loaded. @@ -150,7 +150,9 @@ The wren-mcp-setup skill will: ## Phase 5 — Verify and confirm -Once the MCP server is registered, ask the AI agent to run a health check in the current session: +Once the MCP server is registered, the user must **start a new session** for the Wren MCP tools to be loaded. Instruct the user to do this now. + +In the new session, ask the AI agent to run a health check: ``` Use health_check() to verify Wren Engine is reachable. 
@@ -161,8 +163,7 @@ Expected response: `SELECT 1` returns successfully. If the health check passes: - Tell the user setup is complete. -- Remind them to **start a new session** so the Wren MCP tools are fully loaded. -- In the new session, they can start querying immediately: +- In this session, they can start querying immediately: ``` Query: How many orders are in the orders table? From 1348187d1aaa39d24e4b42248062c5a587d7fe27 Mon Sep 17 00:00:00 2001 From: Jax Liu Date: Fri, 6 Mar 2026 15:14:35 +0800 Subject: [PATCH 5/5] =?UTF-8?q?fix(skills):=20normalize=20conn=5Finfo=20?= =?UTF-8?q?=E2=86=92=20connectionInfo=20in=20generate-mdl?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Consistent naming across the entire document to avoid confusing agents. Co-Authored-By: Claude Sonnet 4.6 --- skills/generate-mdl/SKILL.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/skills/generate-mdl/SKILL.md b/skills/generate-mdl/SKILL.md index 38b109e03..71c3a2e17 100644 --- a/skills/generate-mdl/SKILL.md +++ b/skills/generate-mdl/SKILL.md @@ -246,7 +246,7 @@ When in doubt, use `VARCHAR` as a safe fallback. ## Connection info format -Pass to `setup_connection(datasource=..., conn_info={...})`: +Pass to `setup_connection(datasource=..., connectionInfo={...})`: ``` POSTGRES : {"host": "...", "port": "5432", "user": "...", "password": "...", "database": "..."}