Merged
3 changes: 3 additions & 0 deletions docs/getting_started_with_claude_code.md
@@ -277,6 +277,9 @@ Confirm `~/wren-workspace/target/mdl.json` exists before starting the container.
**Database connection refused inside Docker:**
Change `localhost` / `127.0.0.1` to `host.docker.internal` in your connection info.

**MCP tools fail with "Session not found" after container restart:**
Start a new Claude Code session. Container restarts invalidate MCP sessions — the client must reconnect.

**`wren-generate-mdl` fails because wren-ibis-server is not running:**
Start the container first (Phase 2), then run `/wren-generate-mdl`. wren-ibis-server is available at `http://localhost:8000` once the container is up.
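
Before retrying, it can help to confirm the server is actually listening on port 8000. A minimal sketch (the `port_open` helper is hypothetical, not part of the skill):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP listener answers at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Before running /wren-generate-mdl:
#   port_open("localhost", 8000)  -> True once the container is up
```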

39 changes: 39 additions & 0 deletions mcp-server/app/templates/_fields.html
@@ -3,6 +3,22 @@
{% for field in fields %}
{% if field.type == 'hidden' %}
<input type="hidden" name="{{ field.name }}" value="{{ field.get('value', '') }}">
{% elif field.type == 'file_base64' %}
<label>
{{ field.label }}
<input type="hidden" name="{{ field.name }}" id="fb64-{{ field.name }}" value="">
<input type="file" id="fb64-file-{{ field.name }}"
{% if field.get('accept') %}accept="{{ field.accept }}"{% endif %}
onchange="handleFileBase64(this, '{{ field.name }}')">
{% if connection_info.get(field.name) %}
<small id="fb64-status-{{ field.name }}" style="color:var(--pico-primary)">✓ Credentials already configured (upload a new file to replace them)</small>
{% else %}
<small id="fb64-status-{{ field.name }}"></small>
{% endif %}
{% if field.get('hint') %}
<small style="color:var(--pico-muted-color)">{{ field.hint }}</small>
{% endif %}
</label>
{% else %}
<label>
{{ field.label }}
@@ -21,6 +37,29 @@
</label>
{% endif %}
{% endfor %}
<script>
function handleFileBase64(fileInput, fieldName) {
var hidden = document.getElementById('fb64-' + fieldName);
var status = document.getElementById('fb64-status-' + fieldName);
var file = fileInput.files[0];
if (!file) { hidden.value = ''; status.textContent = ''; return; }
var reader = new FileReader();
reader.onload = function() {
var bytes = new Uint8Array(reader.result);
var binary = '';
for (var i = 0; i < bytes.length; i++) binary += String.fromCharCode(bytes[i]);
hidden.value = btoa(binary);
status.textContent = '✓ ' + file.name + ' loaded (' + Math.ceil(bytes.length / 1024) + ' KB)';
status.style.color = 'var(--pico-primary)';
};
reader.onerror = function() {
hidden.value = '';
status.textContent = '✗ Failed to read file';
status.style.color = 'var(--pico-color-red-500)';
};
reader.readAsArrayBuffer(file);
}
</script>
{% elif datasource %}
<label>
Connection JSON
18 changes: 15 additions & 3 deletions mcp-server/app/web.py
@@ -42,9 +42,9 @@
{"name": "format", "label": "Format", "type": "hidden", "value": "duckdb"},
],
"BIGQUERY": [
{"name": "project", "label": "Project ID", "type": "text", "placeholder": "my-gcp-project"},
{"name": "dataset", "label": "Dataset", "type": "text", "placeholder": "my_dataset"},
{"name": "credentials_base64", "label": "Credentials (Base64)", "type": "password", "placeholder": ""},
{"name": "project_id", "label": "Project ID", "type": "text", "placeholder": "my-gcp-project"},
{"name": "dataset_id", "label": "Dataset", "type": "text", "placeholder": "my_dataset"},
{"name": "credentials", "label": "Service Account JSON", "type": "file_base64", "accept": ".json", "hint": "Upload your GCP service account credentials.json file. It will be base64-encoded automatically."},
],
"SNOWFLAKE": [
{"name": "user", "label": "User", "type": "text", "placeholder": ""},
@@ -131,6 +131,11 @@
{"name": "secret_key", "label": "Secret Key", "type": "password", "placeholder": ""},
{"name": "credentials", "label": "Credentials (Base64)", "type": "password", "placeholder": "eyJ..."},
],
"DATABRICKS": [
{"name": "serverHostname", "label": "Server Hostname", "type": "text", "placeholder": "dbc-xxxxxxxx-xxxx.cloud.databricks.com"},
{"name": "httpPath", "label": "HTTP Path", "type": "text", "placeholder": "/sql/1.0/warehouses/xxxxxxxx"},
{"name": "accessToken", "label": "Access Token", "type": "password", "placeholder": ""},
],
}

# Callbacks injected by wren.py via init()
@@ -213,6 +218,13 @@ async def post_connection(request: Request):
return HTMLResponse(_msg("✗ Please select a data source.", ok=False))

state = _get_state()

# Merge with existing connection info so that omitted sensitive fields
# (e.g. credentials not re-uploaded) retain their saved values.
existing = state.get("connection_info") or {}
if existing:
# Skip empty submitted values (e.g. a file field left blank) so they
# don't clobber the saved secrets this merge is meant to preserve.
submitted = {k: v for k, v in conn_info.items() if v not in ("", None)}
conn_info = {**existing, **submitted}
mdl_ds = (state.get("data_source") or "").upper()
if state.get("is_deployed") and mdl_ds and mdl_ds != ds:
return HTMLResponse(
19 changes: 12 additions & 7 deletions mcp-server/app/wren.py
@@ -146,13 +146,18 @@ def _save_settings() -> None:
_load_settings()

if mdl_path:
with open(mdl_path) as f:
mdl_schema = json.load(f)
data_source = mdl_schema["dataSource"].lower()
mdl_cache.set_mdl(dict_to_base64_string(mdl_schema))
models = mdl_schema.get("models", [])
total_columns = sum(len(m.get("columns", [])) for m in models)
print(f"Loaded MDL {f.name} ({len(models)} models, {total_columns} columns)") # noqa: T201
try:
with open(mdl_path) as f:
mdl_schema = json.load(f)
data_source = mdl_schema.get("dataSource", "").lower() or None
mdl_cache.set_mdl(dict_to_base64_string(mdl_schema))
models = mdl_schema.get("models", [])
total_columns = sum(len(m.get("columns", [])) for m in models)
print(f"Loaded MDL {f.name} ({len(models)} models, {total_columns} columns)") # noqa: T201
except FileNotFoundError:
print(f"MDL file not found at {mdl_path} — starting without a loaded MDL") # noqa: T201
except (json.JSONDecodeError, KeyError) as e:
print(f"Failed to parse MDL file {mdl_path}: {e} — starting without a loaded MDL") # noqa: T201
else:
print("No MDL_PATH environment variable found")

2 changes: 1 addition & 1 deletion skills/index.json
@@ -68,7 +68,7 @@
},
{
"name": "wren-mcp-setup",
"version": "1.3",
"version": "1.4",
"description": "Set up Wren Engine MCP server via Docker and register it with an AI agent.",
"tags": [
"wren",
2 changes: 1 addition & 1 deletion skills/versions.json
@@ -3,7 +3,7 @@
"wren-connection-info": "1.5",
"wren-project": "1.5",
"wren-sql": "1.0",
"wren-mcp-setup": "1.3",
"wren-mcp-setup": "1.4",
"wren-quickstart": "1.3",
"wren-http-api": "1.0",
"wren-usage": "1.2"
12 changes: 6 additions & 6 deletions skills/wren-connection-info/SKILL.md
@@ -45,10 +45,10 @@ Read the linked reference file for the user's data source to get required fields
Most database connectors need: `host`, `port`, `user`, `password`, `database`.

Exceptions:
- **BigQuery** — uses `project_id`, `dataset_id`, `credentials_json_string` (base64-encoded). See [databases.md](references/databases.md) for encoding instructions.
- **Snowflake** — uses `account` instead of `host`, plus `sf_schema`.
- **BigQuery** — uses `project_id`, `dataset_id`, `credentials` (base64-encoded). See [databases.md](references/databases.md) for encoding instructions.
- **Snowflake** — uses `account` instead of `host`, plus `schema`.
- **Trino** — needs `catalog` and `schema` instead of `database`.
- **Databricks** — uses `server_hostname`, `http_path`, `access_token` (or service principal).
- **Databricks** — uses `serverHostname`, `httpPath`, `accessToken` (or service principal with `clientId`, `clientSecret`).
- **Spark** — only `host` and `port` (Spark Connect protocol, no auth fields).
- **File sources** — use `url`, `format`, plus bucket/credentials. See [file-sources.md](references/file-sources.md).
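
As an illustration of the naming exceptions above (a sketch; values are placeholders, only the key names are meaningful):

```python
# Hypothetical examples showing the field-name conventions above.
bigquery_info = {
    "project_id": "my-gcp-project",
    "dataset_id": "my_dataset",
    "credentials": "<base64-encoded service account JSON>",
}

databricks_info = {
    "serverHostname": "dbc-xxx.cloud.databricks.com",  # camelCase, unlike most connectors
    "httpPath": "/sql/1.0/warehouses/xxx",
    "accessToken": "<personal access token>",
}
```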

@@ -73,11 +73,11 @@ Never log, display, or pass sensitive values through the AI agent unnecessarily.
| Connector | Sensitive fields |
|-----------|-----------------|
| Postgres / MySQL / MSSQL / ClickHouse / Oracle / Doris / Redshift | `password` |
| BigQuery | `credentials_json_string` |
| BigQuery | `credentials` |
| Snowflake | `password` |
| Athena | `aws_access_key_id`, `aws_secret_access_key` |
| Databricks (token) | `access_token` |
| Databricks (service principal) | `client_id`, `client_secret` |
| Databricks (token) | `accessToken` |
| Databricks (service principal) | `clientId`, `clientSecret` |
| S3 / MinIO | `access_key`, `secret_key` |
| GCS | `key_id`, `secret_key`, `credentials` |
| Trino / Spark / Local files | (none) |
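
The table can be applied mechanically before echoing connection info anywhere. A sketch, with the sensitive-key set transcribed from the table above (the `redact` helper itself is hypothetical, not part of the skill):

```python
# Keys marked sensitive in the table above.
SENSITIVE = {
    "password", "credentials", "aws_access_key_id", "aws_secret_access_key",
    "accessToken", "clientId", "clientSecret", "access_key", "secret_key", "key_id",
}

def redact(conn_info: dict) -> dict:
    """Return a copy of conn_info that is safe to display or log."""
    return {k: ("***" if k in SENSITIVE else v) for k, v in conn_info.items()}
```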
24 changes: 12 additions & 12 deletions skills/wren-connection-info/references/databases.md
@@ -57,7 +57,7 @@ Set `redshift_type` to `"redshift_iam"` in the connection info to use IAM auth.
|-------|-------------|-----------|
| `project_id` | GCP project ID | |
| `dataset_id` | Dataset name | |
| `credentials_json_string` | Base64-encoded service account JSON | ✓ |
| `credentials` | Base64-encoded service account JSON | ✓ |

**BigQuery credentials encoding**: Wren requires the service account JSON as a **base64-encoded string**, not the raw file.
After downloading `credentials.json` from GCP, run:
@@ -69,7 +69,7 @@
```
base64 -i credentials.json | tr -d '\n'   # macOS
base64 -w 0 credentials.json              # GNU/Linux
```

Paste the output into the `credentials_json_string` field.
Paste the output into the `credentials` field. The Web UI also supports uploading the JSON file directly.
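
For reference, the same encoding can be done in Python (a sketch; the file path is illustrative):

```python
import base64
from pathlib import Path

def encode_credentials(path: str) -> str:
    """Base64-encode a service account file for the `credentials` field."""
    raw = Path(path).read_bytes()
    # b64encode produces a single line with no trailing newline,
    # so no tr/-w 0 equivalent is needed.
    return base64.b64encode(raw).decode("ascii")
```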

---

@@ -81,7 +81,7 @@
| `password` | Password | ✓ |
| `account` | Account identifier | |
| `database` | Database name | |
| `sf_schema` | Schema name | |
| `schema` | Schema name | |

---

@@ -101,7 +101,7 @@
| Field | Description | Sensitive |
|-------|-------------|-----------|
| `s3_staging_dir` | S3 staging directory (`s3://bucket/prefix/`) | |
| `region` | AWS region | |
| `region_name` | AWS region | |
| `aws_access_key_id` | AWS access key ID | ✓ |
| `aws_secret_access_key` | AWS secret access key | ✓ |

@@ -111,19 +111,19 @@

| Field | Description | Sensitive |
|-------|-------------|-----------|
| `server_hostname` | Workspace hostname (e.g. `dbc-xxx.cloud.databricks.com`) | |
| `http_path` | SQL warehouse HTTP path (e.g. `/sql/1.0/warehouses/xxx`) | |
| `access_token` | Personal access token | ✓ |
| `serverHostname` | Workspace hostname (e.g. `dbc-xxx.cloud.databricks.com`) | |
| `httpPath` | SQL warehouse HTTP path (e.g. `/sql/1.0/warehouses/xxx`) | |
| `accessToken` | Personal access token | ✓ |

## Databricks (service principal)

| Field | Description | Sensitive |
|-------|-------------|-----------|
| `server_hostname` | Workspace hostname | |
| `http_path` | SQL warehouse HTTP path | |
| `client_id` | OAuth M2M client ID | ✓ |
| `client_secret` | OAuth M2M client secret | ✓ |
| `azure_tenant_id` | Azure AD tenant ID (Azure Databricks only) | |
| `serverHostname` | Workspace hostname | |
| `httpPath` | SQL warehouse HTTP path | |
| `clientId` | OAuth M2M client ID | ✓ |
| `clientSecret` | OAuth M2M client secret | ✓ |
| `azureTenantId` | Azure AD tenant ID (Azure Databricks only) | |

---

10 changes: 7 additions & 3 deletions skills/wren-mcp-setup/SKILL.md
@@ -5,7 +5,7 @@ compatibility: Requires Docker Desktop (or Docker Engine).
license: Apache-2.0
metadata:
author: wren-engine
version: "1.3"
version: "1.4"
---

# Set Up Wren MCP via Docker
@@ -132,7 +132,7 @@ docker run -d \
ghcr.io/canner/wren-engine-ibis:latest
```

> If `MDL_PATH` is not set (or the file doesn't exist yet), the container starts without a loaded MDL. You can deploy later using the `deploy` MCP tool or the Web UI.
> If `MDL_PATH` is not set or the file doesn't exist yet, the container starts without a loaded MDL. You can deploy later using the `deploy` MCP tool or the Web UI.

This starts the container using the image `ghcr.io/canner/wren-engine-ibis:latest` with:

@@ -316,7 +316,11 @@ claude mcp add --transport http wren http://localhost:19000/mcp
```
A `405 Method Not Allowed` response means the endpoint is reachable but expects a POST — that is normal and indicates the MCP server is up.

### 4. Database connection refused inside the container
### 4. MCP tools fail with "Session not found" after container restart

Container restarts invalidate all active MCP sessions. The AI client still holds the old session ID, so every MCP call returns `"Session not found"`. **Start a new Claude Code session** (or restart your MCP client) to reconnect.

### 5. Database connection refused inside the container

If `health_check()` passes but queries fail with a connection error, the database host is likely still set to `localhost`. Open the Web UI at `http://localhost:9001`, edit the connection info, and change the host to `host.docker.internal`.
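
The rewrite is mechanical. A sketch of the idea (a hypothetical helper, not something the Web UI exposes):

```python
def fix_host_for_docker(conn_info: dict) -> dict:
    """Point loopback hosts at the Docker host so the container can reach the DB."""
    if conn_info.get("host") in ("localhost", "127.0.0.1"):
        return {**conn_info, "host": "host.docker.internal"}
    return conn_info

# fix_host_for_docker({"host": "localhost", "port": 5432})
# -> {"host": "host.docker.internal", "port": 5432}
```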
