fix(mcp-server,skills): fix onboarding blockers from test report #1478
douenergy merged 4 commits into Canner:main

Conversation
No actionable comments were generated in the recent review. 🎉
ℹ️ Recent review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
📝 Walkthrough
Adds a base64 file-upload field to the MCP web UI, renames and adds datasource credential fields (BigQuery, Snowflake, Athena, Databricks), merges posted connection info with stored state to preserve omitted sensitive fields, hardens MDL loading, and updates related docs, skill metadata, and troubleshooting.

Changes
Sequence Diagram(s)
sequenceDiagram
participant User
participant Browser
participant MCP_Server
participant StateStore
User->>Browser: choose credentials JSON file
Browser->>Browser: handleFileBase64(file) -> read ArrayBuffer, encode base64
Browser->>MCP_Server: POST connection form (may include `credentials` base64)
MCP_Server->>StateStore: read existing connection_info
MCP_Server->>MCP_Server: merge submitted conn_info with existing (preserve omitted sensitive fields)
MCP_Server->>StateStore: save merged connection_info
MCP_Server-->>Browser: respond (success/failure)
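The merge step in the diagram can be sketched as follows. This is an illustrative JavaScript version only; the actual merge is server-side Python in mcp-server/app/web.py, and the function name is hypothetical. The rule it shows: a field that the form omits or leaves blank keeps its previously saved value, so sensitive fields like `credentials` survive an edit that does not re-upload them.

```javascript
// Hypothetical sketch of the connection-info merge rule:
// submitted non-empty values win; omitted or blank fields
// fall back to whatever is already stored.
function mergeConnectionInfo(existing, submitted) {
  var merged = Object.assign({}, existing); // start from saved state
  for (var key in submitted) {
    if (submitted[key] !== undefined && submitted[key] !== '') {
      merged[key] = submitted[key]; // only overwrite with real input
    }
  }
  return merged;
}
```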
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Possibly related PRs
🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
…names, file upload
- Handle missing/invalid MDL file gracefully instead of crashing on startup
- Fix BigQuery Web UI field names to match backend pydantic model (project→project_id, dataset→dataset_id, credentials_base64→credentials)
- Add file_base64 input type so users can upload credentials.json directly instead of manually base64-encoding
- Add Databricks data source to Web UI

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
e1e5983 to a6251d4
🧹 Nitpick comments (1)
mcp-server/app/templates/_fields.html (1)
41-59: Consider UTF-8 safety for `btoa()` encoding.
`btoa()` only handles Latin-1 characters and will throw an error if the file contains multi-byte UTF-8 characters. While GCP service account JSON files are typically ASCII-only, a more robust approach would handle arbitrary UTF-8:
♻️ Optional: UTF-8-safe base64 encoding
```diff
 reader.onload = function() {
-  hidden.value = btoa(reader.result);
+  // UTF-8 safe base64 encoding
+  var bytes = new TextEncoder().encode(reader.result);
+  var binary = '';
+  for (var i = 0; i < bytes.length; i++) {
+    binary += String.fromCharCode(bytes[i]);
+  }
+  hidden.value = btoa(binary);
   status.textContent = '✓ ' + file.name + ' loaded (' + Math.ceil(reader.result.length / 1024) + ' KB)';
```

Note: The static analysis hints about "special characters must be escaped" and "id must be unique" are false positives — they don't account for Jinja2 template syntax and loop-generated unique IDs.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@mcp-server/app/templates/_fields.html`:
- Around line 41-59: handleFileBase64 currently uses FileReader.readAsText and
btoa(reader.result), which will fail for multi-byte UTF‑8; change the reader to
read the file as binary (e.g., use readAsArrayBuffer or readAsDataURL) and
produce a proper base64 string instead of btoa(reader.result). Specifically, in
handleFileBase64 replace the readAsText + btoa(reader.result) flow with reading
as an ArrayBuffer (or DataURL) and convert the resulting ArrayBuffer/Blob into a
base64 payload, then set document.getElementById('fb64-'+fieldName).value to
that base64 string and update the status element as before; keep the existing
hidden/id pattern ('fb64-'+fieldName and 'fb64-status-'+fieldName) and error
handling.
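The conversion the prompt describes can be sketched as below. The helper name `uint8ToBase64` is illustrative, not from the PR; it shows the suggested fallback chain (native `Uint8Array.prototype.toBase64` where the engine has it, otherwise chunked `String.fromCharCode` plus `btoa`), with chunking so large files don't blow the argument list.

```javascript
// Hypothetical helper: binary-safe Uint8Array -> base64.
function uint8ToBase64(bytes) {
  if (typeof bytes.toBase64 === 'function') {
    return bytes.toBase64(); // newer engines expose this directly
  }
  var binary = '';
  var CHUNK = 0x8000; // 32 KiB slices keep fromCharCode's arg list safe
  for (var i = 0; i < bytes.length; i += CHUNK) {
    binary += String.fromCharCode.apply(null, bytes.subarray(i, i + CHUNK));
  }
  return btoa(binary);
}

// Wiring it into the upload flow: read the file as binary, not text.
function readFileAsBase64(file, onDone) {
  var reader = new FileReader();
  reader.onload = function () {
    onDone(uint8ToBase64(new Uint8Array(reader.result)));
  };
  reader.readAsArrayBuffer(file);
}
```

Reading as an ArrayBuffer sidesteps the Latin-1 limitation entirely, since `btoa` then only ever sees code points 0-255.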
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: e14a1295-ad29-4470-9ba2-56ddd7399c4d
📒 Files selected for processing (7)
- docs/getting_started_with_claude_code.md
- mcp-server/app/templates/_fields.html
- mcp-server/app/web.py
- mcp-server/app/wren.py
- skills/versions.json
- skills/wren-connection-info/SKILL.md
- skills/wren-mcp-setup/SKILL.md
…ooting
- Fix BigQuery: credentials_json_string→credentials
- Fix Snowflake: sf_schema→schema (use alias accepted by backend)
- Fix Athena: region→region_name
- Fix Databricks: use camelCase aliases (serverHostname, httpPath, accessToken)
- Add "Session not found" troubleshooting to wren-mcp-setup and getting started doc
- Bump wren-connection-info to v1.5, wren-mcp-setup to v1.4

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
a6251d4 to c402f5c
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@mcp-server/app/templates/_fields.html`:
- Around line 9-10: The hidden input currently injects full saved credentials
via value="{{ connection_info.get(field.name, '') }}" (id="fb64-{{ field.name
}}"), which exposes secrets; remove the prefilled credential value and instead
leave the hidden input empty or store only a non-sensitive flag/ID (e.g.,
boolean like connection_info.get(field.name) and/or a credential_id) so actual
credentials are never rendered into the DOM—update the template to clear the
value attribute and adjust any server-side/JS logic that relied on the raw value
to read the non-sensitive flag or fetch secrets securely from the server when
needed.
- Around line 47-50: The current reader.onload uses btoa(reader.result) and
reader.readAsText(file), which fails on non-Latin-1 UTF‑8 content; change to
read the file as an ArrayBuffer (replace reader.readAsText with
reader.readAsArrayBuffer) and set hidden.value to a UTF‑8‑safe base64 conversion
of the ArrayBuffer (use new Uint8Array(reader.result).toBase64() when available,
otherwise convert the Uint8Array to a binary string in safe chunks and call btoa
on that binary string) inside the reader.onload that currently references
hidden.value, status and file so uploads of JSON/non‑ASCII files succeed.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 321fc0d6-8cf4-46ae-b886-3cb1acf6a8c4
📒 Files selected for processing (9)
- docs/getting_started_with_claude_code.md
- mcp-server/app/templates/_fields.html
- mcp-server/app/web.py
- mcp-server/app/wren.py
- skills/index.json
- skills/versions.json
- skills/wren-connection-info/SKILL.md
- skills/wren-connection-info/references/databases.md
- skills/wren-mcp-setup/SKILL.md
✅ Files skipped from review due to trivial changes (2)
- skills/versions.json
- skills/index.json
🚧 Files skipped from review as they are similar to previous changes (5)
- docs/getting_started_with_claude_code.md
- mcp-server/app/web.py
- mcp-server/app/wren.py
- skills/wren-mcp-setup/SKILL.md
- skills/wren-connection-info/SKILL.md
…coding
- Don't prefill saved credentials into hidden DOM field; show status text instead to avoid exposing secrets in page source
- Merge submitted form with existing connection info so omitted sensitive fields retain their saved values
- Use readAsArrayBuffer + Uint8Array→btoa instead of readAsText + btoa to handle non-Latin-1 UTF-8 characters in JSON files

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@mcp-server/app/templates/_fields.html`:
- Around line 55-58: The onerror handler leaves the previous base64 in
hidden.value causing stale credentials to be submitted; update the
reader.onerror function (the handler that currently sets status and
status.style.color) to also clear the hidden input (set hidden.value to an empty
string) and, if the file input element is available in scope, reset it (e.g.,
set fileInput.value = '') so a failed read does not retain the previous file
data.
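The fix described above amounts to a small, DOM-agnostic cleanup step. The sketch below is illustrative (the function name is hypothetical, and the element shapes follow the template's `fb64-` pattern): on a failed read, clear the hidden base64 field and reset the picker so stale credentials can never ride along on the next submit.

```javascript
// Illustrative onerror handler: a failed read must not leave the
// previously loaded base64 in the hidden input, or it would be
// silently resubmitted with the form.
function handleReadError(hidden, fileInput, status, file) {
  hidden.value = '';                    // drop any previously loaded base64
  if (fileInput) fileInput.value = '';  // reset picker so re-selecting fires change again
  status.textContent = '✗ failed to read ' + file.name;
  status.style.color = 'red';
}
```

Because it only touches `.value`, `.textContent`, and `.style`, it works identically on real DOM elements and plain test stubs.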
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2f64b7bf-c65b-4427-b1ec-bfd5bc9ab13a
📒 Files selected for processing (2)
- mcp-server/app/templates/_fields.html
- mcp-server/app/web.py
🚧 Files skipped from review as they are similar to previous changes (1)
- mcp-server/app/web.py
Prevents stale credentials from being submitted when a replacement file upload fails to read.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
Fixes 6 issues discovered during onboarding testing with BigQuery (see test report):
- Wrap the MDL_PATH file read in try/except — start without MDL instead of crashing
- Rename BigQuery Web UI fields (project→project_id, dataset→dataset_id, credentials_base64→credentials) to match backend pydantic model
- Add a file upload so credentials.json is base64-encoded client-side
- Fix field names in wren-connection-info for BigQuery (credentials), Snowflake (schema), Athena (region_name), Databricks (camelCase aliases)

Test plan
- Missing target/mdl.json — should log warning and start normally
- Invalid MDL_PATH — should log parse error and start normally
- BigQuery form shows project_id, dataset_id, and file upload for credentials
- Upload credentials.json via file picker → verify base64 value is sent on form submit
- Databricks form — verify serverHostname, httpPath, accessToken fields appear
- wren-connection-info skill documents correct field names for all data sources

🤖 Generated with Claude Code