> **Caution:** Review failed. The pull request is closed.

## Walkthrough

Adds docker-compose orchestration and a new Node/Express-based wren-ai-service with LLM-backed insights (OpenAI/Ollama) and a health endpoint. Updates the UI server to handle missing-table SQLite errors gracefully. Introduces a setup README, adjusts Docker-related files, and switches a submodule URL from SSH to HTTPS.
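The walkthrough notes that the UI server now tolerates missing-table SQLite errors. A minimal sketch of that pattern in Node (the `safeQuery` name and the error-message check are assumptions for illustration, not the PR's actual code):

```javascript
// Hypothetical sketch: treat a missing table as an empty result set
// instead of crashing. SQLite drivers surface this case with an error
// whose message contains "no such table".
function safeQuery(runQuery) {
  try {
    return runQuery();
  } catch (err) {
    if (/no such table/i.test(err.message)) {
      return []; // table not created yet: degrade to "no rows"
    }
    throw err; // any other error is still fatal
  }
}
```

Any other SQLite error (locking, disk I/O, syntax) is re-thrown so it still surfaces normally.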
## Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant U as User
    participant UI as wren-ui (Frontend)
    participant S as wren-ai-service
    participant R as Routes (/insights)
    participant SV as Insights Service
    participant L as LLM Provider
    note over UI,S: WREN_API_BASE -> http://wren-ai-service:7000
    U->>UI: Request insights
    UI->>S: POST /insights { sql, rows, columns }
    S->>R: Route dispatch
    alt INSIGHTS_ENABLED = true
        R->>SV: createInsights(input)
        SV->>SV: Build prompt (sample rows, columns, SQL)
        opt LLM kind selection
            SV->>L: getInsightsLlm() → OpenAI or Ollama
        end
        SV->>L: generate(prompt)
        L-->>SV: insights text
        SV-->>R: { insights }
        R-->>S: 200 OK JSON
        S-->>UI: insights payload
    else disabled
        R-->>S: 200 OK { insights: [], disabled: true }
        S-->>UI: disabled response
    end
```
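The flow in the diagram can be sketched as a plain handler function. The names here (`handleInsights`, the `enabled` flag, the `generate` callback) are hypothetical stand-ins for the service's INSIGHTS_ENABLED check and its LLM provider (OpenAI or Ollama behind one interface), not the actual implementation:

```javascript
// Hypothetical sketch of the /insights branch shown in the diagram.
async function handleInsights(input, opts) {
  if (!opts.enabled) {
    // Disabled: still answer 200 so the UI can degrade gracefully.
    return { status: 200, body: { insights: [], disabled: true } };
  }
  const sample = input.rows.slice(0, 5); // cap rows embedded in the prompt
  const prompt = [
    `SQL: ${input.sql}`,
    `Columns: ${input.columns.join(', ')}`,
    `Sample rows: ${JSON.stringify(sample)}`,
    'Summarize notable patterns in this result set.',
  ].join('\n');
  // opts.generate abstracts over the selected LLM provider.
  const insights = await opts.generate(prompt);
  return { status: 200, body: { insights } };
}
```

Note that both branches return 200: the disabled case is signaled in the body (`disabled: true`) rather than with an error status, matching the diagram's `else disabled` path.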
## Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes
## Recent review details

Configuration used: CodeRabbit UI. Review profile: CHILL. Plan: Pro.

- ⛔ Files ignored due to path filters (1)
- 📒 Files selected for processing (15)
## Summary by CodeRabbit

- **New Features:** docker-compose orchestration; a new Node/Express wren-ai-service with LLM-backed insights (OpenAI/Ollama) and a health endpoint.
- **Bug Fixes:** the UI server now handles missing-table SQLite errors gracefully.
- **Documentation:** setup README added.
- **Chores:** Docker-related files adjusted; a submodule URL switched from SSH to HTTPS.