diff --git a/.claude/commands/converge.md b/.claude/commands/converge.md index 086509941..1d95fe626 100644 --- a/.claude/commands/converge.md +++ b/.claude/commands/converge.md @@ -28,11 +28,12 @@ Cycle | Gate | Review Critical | Review Important | Regressions | Verdict ### Step 2: Run Cycle **A. Clear Stale State** -Before running gates, clear the previous gate result in `.workflow/state.json`: -1. Read the file (create `{}` if missing) -2. Set `gates.passed` to `null` -3. Set `gates.commit_ref` to `null` -4. Write the file back +Before running gates, clear the previous gate result: + +```bash +python scripts/workflow-state.py set-null gates.passed +python scripts/workflow-state.py set-null gates.commit_ref +``` This ensures that a fix cycle cannot accidentally rely on a stale gate pass. diff --git a/.claude/commands/gates.md b/.claude/commands/gates.md index bcf714dbc..dcaded614 100644 --- a/.claude/commands/gates.md +++ b/.claude/commands/gates.md @@ -118,7 +118,7 @@ Fail: snapshot mismatch — report which snapshots differ **Gate 6: AC Verification** (if specs with ACs are relevant) -Read `specs/README.md` to map changed files to specs. For each relevant spec with ACs: +To find governing specs: grep all `specs/*.md` files for `**Code:**` lines. Match changed file paths against the code path prefixes in each spec's frontmatter. For each relevant spec with ACs: - Extract test method names from the AC table - For .NET tests: `dotnet test --filter "FullyQualifiedName~{method}" -v q --no-build` - For TypeScript tests: `npx vitest run -t "{method}" --prefix src/PPDS.Extension` @@ -158,11 +158,12 @@ Gates are necessary but NOT sufficient. After gates pass: ## Workflow State -After all gates pass (verdict is PASS), update `.workflow/state.json`: -1. Read the file (create `{}` if missing) -2. Set `gates.passed` to the current ISO 8601 timestamp -3. Set `gates.commit_ref` to current HEAD SHA (from `git rev-parse HEAD`) -4. 
Write the file back +After all gates pass (verdict is PASS), run: + +```bash +python scripts/workflow-state.py set gates.passed now +python scripts/workflow-state.py set gates.commit_ref "$(git rev-parse HEAD)" +``` ## Rules diff --git a/.claude/commands/implement.md b/.claude/commands/implement.md index bf5b19879..f92bf204f 100644 --- a/.claude/commands/implement.md +++ b/.claude/commands/implement.md @@ -29,20 +29,10 @@ Before dispatching any agents, load the specification context that will be injec **A. Read Foundation** - Read `specs/CONSTITUTION.md` — full content will be injected into every subagent prompt -- Read `specs/README.md` — maps code paths to specs **B. Identify Relevant Specs** - From the plan, identify which source directories/files will be touched -- Map each to its spec using the README.md code column: - - `src/PPDS.Dataverse/Pooling/` → `specs/connection-pooling.md` - - `src/PPDS.Dataverse/Query/` → `specs/query.md` - - `src/PPDS.Cli/Tui/` → `specs/tui.md` + `specs/tui-foundation.md` - - `src/PPDS.Cli/Commands/` → `specs/cli.md` - - `src/PPDS.Mcp/` → `specs/mcp.md` - - `src/PPDS.Migration/` → `specs/migration.md` - - `src/PPDS.Auth/` → `specs/authentication.md` - - `src/PPDS.Extension/src/panels/` → `specs/per-panel-environment-scoping.md` (if panels) or relevant spec - - `src/PPDS.Extension/` → check `specs/README.md` for extension-related specs +- Grep all `specs/*.md` files for `**Code:**` frontmatter lines. Match each touched source directory against code path prefixes to find governing specs. This replaces hardcoded mappings and the README index. - Always include `specs/architecture.md` - Read each relevant spec and extract the `## Acceptance Criteria` section @@ -62,6 +52,8 @@ with a spec or constitution principle, STOP and report the conflict to the orchestrator — do not silently deviate. 
``` +**Note:** When creating new code paths for a spec (new directories, new files in new locations), update the spec’s `**Code:**` frontmatter to include the new paths. This ensures frontmatter grep continues to discover the correct specs. + ### Step 3: Assess Current State - Check git status and current branch - Search for any existing work (worktrees, branches) related to this plan @@ -71,13 +63,14 @@ to the orchestrator — do not silently deviate. ### Step 3.5: Initialize Workflow State -Update `.workflow/state.json` to record that implementation has started: -1. Read the file (create `{}` if missing) -2. Set `branch` to the current branch name -3. Set `spec` to the path of the primary spec associated with the plan -4. Set `plan` to the plan file path ($ARGUMENTS) -5. Set `started` to the current ISO 8601 timestamp -6. Write the file back +Record that implementation has started: + +```bash +python scripts/workflow-state.py set branch "$(git rev-parse --abbrev-ref HEAD)" +python scripts/workflow-state.py set spec "{spec-path}" +python scripts/workflow-state.py set plan "$ARGUMENTS" +python scripts/workflow-state.py set started now +``` ### Step 4: Create Task Tracking - Use TaskCreate to build a task list from the plan phases @@ -198,7 +191,7 @@ If `/review` finds critical or important issues, invoke `/converge` to run the f 3. **Parallel by default** - if tasks don't depend on each other, run them simultaneously 4. **Sequential when required** - respect phase gates and dependency chains 5. **One commit per phase** - each phase gate produces exactly one commit with a clear message -6. **Review before commit** - always use code-reviewer agent before committing phase work +6. **Review before commit** - invoke `/review` before committing phase work 7. **Fix before advancing** - if build fails, tests fail, or review finds issues, fix them BEFORE committing. Dispatch fix agents rather than debugging yourself. 8. 
**Never skip verification** - always build + test + review before declaring a phase complete 9. **Continue until done** - execute ALL phases in the plan, don't stop early and ask permission diff --git a/.claude/commands/qa.md b/.claude/commands/qa.md index 88693da43..f93538155 100644 --- a/.claude/commands/qa.md +++ b/.claude/commands/qa.md @@ -418,13 +418,13 @@ Present the final merged report to the user. ## Workflow State -After QA passes for a surface (verdict is PASS — all checks green), update `.workflow/state.json`: -1. Read the file (create `{}` if missing) -2. Set `qa.{surface}` to the current ISO 8601 timestamp -3. Surface key matches mode: `ext`, `tui`, `mcp`, `cli` -4. Write the file back +After QA passes for a surface (verdict is PASS — all checks green), run: -Example: after `/qa extension` passes, set `qa.ext` to the timestamp. +```bash +python scripts/workflow-state.py set qa.{surface} now +``` + +Surface key matches mode: `ext`, `tui`, `mcp`, `cli`. Example: `/qa extension` → `qa.ext`. ## Rules diff --git a/.claude/commands/reset.md b/.claude/commands/reset.md index eec26977c..3c5de47fd 100644 --- a/.claude/commands/reset.md +++ b/.claude/commands/reset.md @@ -4,14 +4,13 @@ Clear workflow state for the current branch. ## Usage -`/reset` — Delete `.workflow/state.json` and start fresh. +`/reset` — Delete workflow state and start fresh. ## Process -1. Check if `.workflow/state.json` exists -2. If it exists, delete it -3. Print confirmation: "Workflow state cleared. Run /gates to begin tracking." -4. If it doesn't exist, print: "No workflow state to clear." +```bash +python scripts/workflow-state.py delete +``` ## When to Use diff --git a/.claude/commands/review.md b/.claude/commands/review.md index 391fc817c..fcc9a244c 100644 --- a/.claude/commands/review.md +++ b/.claude/commands/review.md @@ -28,7 +28,7 @@ If $ARGUMENTS specifies a scope, filter the diff to those paths only. 
### Step 2: Load Spec Context (NO implementation context) - Read `specs/CONSTITUTION.md` -- Map changed files to specs via `specs/README.md` +- Map changed files to specs by grepping all `specs/*.md` files for `**Code:**` frontmatter lines. Match changed file paths against code path prefixes to find governing specs. - Read each relevant spec — extract ONLY the `## Acceptance Criteria` section - Do NOT read any plan files, task descriptions, or implementation notes @@ -106,11 +106,14 @@ Include total counts and a clear verdict: ## Workflow State -After review completes (all findings evaluated and verdict rendered), update `.workflow/state.json`: -1. Read the file (create `{}` if missing) -2. Set `review.passed` to the current ISO 8601 timestamp -3. Set `review.findings` to the total count of findings (critical + important + suggestion) -4. Write the file back +After review completes (all findings evaluated and verdict rendered), run: + +```bash +python scripts/workflow-state.py set review.passed now +python scripts/workflow-state.py set review.findings {count} +``` + +Where `{count}` is the total findings (critical + important + suggestion). 
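For reference, the `set` semantics these commands rely on (dotted keys that create nested objects, plus the `now` magic value) can be sketched in a few lines. This is an illustrative sketch only, not the actual `scripts/workflow-state.py` implementation:

```python
import json
import os
from datetime import datetime, timezone

STATE_PATH = ".workflow/state.json"

def _coerce(value: str):
    # Magic values: "now" -> UTC ISO timestamp, "true"/"false" -> bool, digits -> int
    if value == "now":
        return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    if value in ("true", "false"):
        return value == "true"
    if value.isdigit():
        return int(value)
    return value

def set_key(dotted: str, value: str, path: str = STATE_PATH) -> None:
    # Read the file (create {} if missing), walk/create nested dicts, write back
    state = {}
    if os.path.exists(path):
        with open(path, encoding="utf-8") as f:
            state = json.load(f)
    node = state
    *parents, leaf = dotted.split(".")
    for part in parents:
        node = node.setdefault(part, {})
    node[leaf] = _coerce(value)
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    with open(path, "w", encoding="utf-8") as f:
        json.dump(state, f, indent=2)
```

So `set review.findings 3` yields `{"review": {"findings": 3}}` without clobbering sibling keys.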
## Rules diff --git a/.claude/commands/spec-audit.md b/.claude/commands/spec-audit.md index 467bc702e..ca7821a13 100644 --- a/.claude/commands/spec-audit.md +++ b/.claude/commands/spec-audit.md @@ -12,7 +12,6 @@ $ARGUMENTS = spec name for single audit (e.g., `connection-pooling`), or empty f Read these files: - `specs/CONSTITUTION.md` — principles to check against -- `specs/README.md` — index of all specs and their code mappings ### Step 2: Determine Scope @@ -21,7 +20,7 @@ Read these files: - Proceed to Step 3 with this one spec **Full audit** (`$ARGUMENTS` empty): -- Read `specs/README.md` to get list of all specs +- Glob `specs/*.md` (excluding CONSTITUTION.md, SPEC-TEMPLATE.md, README.md) to get the list of all specs - For each spec, dispatch a parallel subagent (use `Agent` tool with `subagent_type: "general-purpose"`) with the audit prompt below - Collect all results and produce a summary diff --git a/.claude/commands/spec.md b/.claude/commands/spec.md index aa6e0ca4f..0c472c26e 100644 --- a/.claude/commands/spec.md +++ b/.claude/commands/spec.md @@ -1,6 +1,6 @@ # Spec -Create or update a specification following PPDS conventions. Ensures consistency, cross-references related specs, and enforces numbered acceptance criteria. +Create, update, or reconcile a specification following PPDS conventions. Three modes: forward (create new), update (modify existing), reconcile (align spec with code). Ensures consistency, cross-references related specs, and enforces numbered acceptance criteria. 
## Input @@ -11,34 +11,42 @@ $ARGUMENTS = spec name (e.g., `connection-pooling` for existing, `new-feature` f ### Step 1: Load Foundation Read these files before doing anything else: -- `specs/CONSTITUTION.md` — non-negotiable principles +- `specs/CONSTITUTION.md` — non-negotiable principles (includes Spec Laws SL1–SL5) - `specs/SPEC-TEMPLATE.md` — structural template -- `specs/README.md` — index of all specs and their code mappings +- Glob `specs/*.md` and grep each for `**Code:**` lines to build a code-path-to-spec map. This replaces the README index. ### Step 2: Determine Mode -**If spec exists** (`specs/$ARGUMENTS.md` found): +**Forward mode** — spec does not exist (`specs/$ARGUMENTS.md` not found): +- Search existing specs for overlapping scope — check that this isn't already covered +- If overlap found, ask user: update existing spec or create new one? +- Proceed to authoring (Step 4) + +**Update mode** — spec exists, user requests specific changes: - Read the existing spec - Read the code files referenced in the spec's `Code:` header line - Identify drift: does the code match what the spec describes? - Identify missing ACs: does the spec have numbered acceptance criteria? - Present findings to user before making changes - -**If spec is new** (`specs/$ARGUMENTS.md` not found): -- Search existing specs for overlapping scope — check that this isn't already covered -- If overlap found, ask user: update existing spec or create new one? 
-- Proceed to authoring +- Proceed to authoring (Step 4) + +**Reconcile mode** — spec exists, user wants to align spec with code (`/spec --reconcile` or when significant code divergence is detected): +- Read the existing spec and extract all ACs, Code paths, and type descriptions +- Read the code at the spec's `Code:` paths +- Enumerate public types, methods, and behaviors in the code +- Compare against existing ACs: + - **Missing ACs:** Code behavior with no corresponding AC → propose new ACs + - **Stale ACs:** AC describes behavior the code no longer has → flag for removal + - **Status drift:** AC references a test that now passes/fails differently → flag for update +- **Member-count verification:** Count public types/methods in code, count ACs. Report the delta. If ACs cover less than 90% of public surface area, warn explicitly. This prevents the known LLM summarization problem (capturing ~70-80% and missing the rest). +- Compare spec prose (Architecture, Core Requirements) against actual code structure +- Present all proposed changes to user for approval before writing +- After writing: regenerate README, append to Changelog +- Skip to Step 6 (reconcile does not go through the authoring flow) ### Step 3: Cross-Reference Related Specs -Based on the spec's domain, read related specs: -- Touches `src/PPDS.Dataverse/` → read `specs/connection-pooling.md`, `specs/architecture.md` -- Touches `src/PPDS.Cli/Tui/` → read `specs/tui.md`, `specs/tui-foundation.md` -- Touches `src/PPDS.Cli/Commands/` → read `specs/cli.md` -- Touches `src/PPDS.Mcp/` → read `specs/mcp.md` -- Touches `src/PPDS.Migration/` → read `specs/migration.md` -- Touches `src/PPDS.Auth/` → read `specs/authentication.md` -- Always read `specs/architecture.md` for cross-cutting patterns +Use the code-path-to-spec map from Step 1 to find related specs. Match the spec's Code: paths against the directories being touched. Always include `specs/architecture.md` and `specs/CONSTITUTION.md`. 
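The frontmatter-grep approach can be sketched as follows — a rough illustration that assumes `**Code:**` lines carry comma-separated path prefixes (helper names are hypothetical, not part of any existing script):

```python
import glob
import re

def build_spec_map(spec_glob: str = "specs/*.md") -> dict:
    # Map each code path prefix to the specs whose **Code:** line lists it.
    # Assumes a frontmatter line like: **Code:** src/PPDS.Dataverse/Pooling/
    mapping: dict = {}
    for spec in glob.glob(spec_glob):
        with open(spec, encoding="utf-8") as f:
            for line in f:
                m = re.match(r"\*\*Code:\*\*\s*(.+)", line.strip())
                if m:
                    for prefix in (p.strip() for p in m.group(1).split(",")):
                        mapping.setdefault(prefix, []).append(spec)
    return mapping

def specs_for(changed_path: str, mapping: dict) -> list:
    # Return every spec whose code path prefix matches the changed file
    return sorted({spec for prefix, specs in mapping.items()
                   if changed_path.startswith(prefix) for spec in specs})
```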
### Step 4: Author/Update Spec @@ -64,7 +72,7 @@ If the user tries to skip ACs, remind them: Constitution I3 requires numbered ac ### Step 6: Finalize 1. Write/update the spec file at `specs/$ARGUMENTS.md` -2. Update `specs/README.md` if this is a new spec (add to appropriate table) +2. Regenerate `specs/README.md` by running `python scripts/generate-spec-readme.py` (or manually scraping frontmatter from all specs if the script doesn't exist yet) 3. Commit: ``` docs(specs): add/update {spec-name} specification @@ -74,7 +82,7 @@ If the user tries to skip ACs, remind them: Constitution I3 requires numbered ac ## Rules -1. **Always read foundation first** — constitution, template, README. No exceptions. +1. **Always read foundation first** — constitution and spec template. No exceptions. 2. **Always cross-reference** — no spec exists in isolation. 3. **ACs are mandatory** — the spec is incomplete without them. 4. **One section at a time** — don't dump the entire spec at once. diff --git a/.claude/commands/verify.md b/.claude/commands/verify.md index b3445fad8..0f3991398 100644 --- a/.claude/commands/verify.md +++ b/.claude/commands/verify.md @@ -178,13 +178,13 @@ Present structured results: ## Workflow State -After verification passes for a surface (verdict is PASS), update `.workflow/state.json`: -1. Read the file (create `{}` if missing) -2. Set `verify.{surface}` to the current ISO 8601 timestamp -3. Surface key matches mode: `ext`, `tui`, `mcp`, `cli` -4. Write the file back +After verification passes for a surface (verdict is PASS), run: -Example: after `/verify extension` passes, set `verify.ext` to the timestamp. +```bash +python scripts/workflow-state.py set verify.{surface} now +``` + +Surface key matches mode: `ext`, `tui`, `mcp`, `cli`. Example: `/verify extension` → `verify.ext`. 
## Rules diff --git a/.claude/skills/ext-panels/SKILL.md b/.claude/skills/ext-panels/SKILL.md index 4601b62aa..7c44d6f93 100644 --- a/.claude/skills/ext-panels/SKILL.md +++ b/.claude/skills/ext-panels/SKILL.md @@ -453,7 +453,7 @@ When designing a new panel or feature, design across all four PPDS surfaces: Dae Read before doing anything: - `specs/CONSTITUTION.md` — A1, A2 govern multi-surface design (services are single code path, UI is thin wrapper) - `specs/architecture.md` — cross-cutting patterns -- Domain-relevant specs via `specs/README.md` +- Domain-relevant specs via frontmatter grep — search `specs/*.md` for `**Code:**` lines matching `src/PPDS.Extension/` ### Inventory Existing Infrastructure diff --git a/.claude/skills/pr/SKILL.md b/.claude/skills/pr/SKILL.md index da040a821..f3eaa59e9 100644 --- a/.claude/skills/pr/SKILL.md +++ b/.claude/skills/pr/SKILL.md @@ -130,24 +130,16 @@ Awaiting your review. ### 6. Write Workflow State After PR is created: -```json -{ - "pr": { - "url": "https://github.com/...", - "created": "2026-03-16T17:00:00Z" - } -} + +```bash +python scripts/workflow-state.py set pr.url "{pr-url}" +python scripts/workflow-state.py set pr.created now ``` -After Gemini comments are triaged (step 4 complete), update workflow state: -```json -{ - "pr": { - "url": "https://github.com/...", - "created": "2026-03-16T17:00:00Z", - "gemini_triaged": true - } -} +After Gemini comments are triaged (step 4 complete): + +```bash +python scripts/workflow-state.py set pr.gemini_triaged true ``` The stop hook will BLOCK the session from ending if `gemini_triaged` is not set after PR creation. Do not skip step 4. diff --git a/.claude/skills/start/SKILL.md b/.claude/skills/start/SKILL.md index b081f5b0c..54a02fc81 100644 --- a/.claude/skills/start/SKILL.md +++ b/.claude/skills/start/SKILL.md @@ -52,13 +52,10 @@ git branch --list "feature/" ### 5. 
Initialize Workflow State -Write `.workflow/state.json` in the worktree: +Initialize workflow state in the worktree: -```json -{ - "branch": "feature/", - "started": "" -} +```bash +python scripts/workflow-state.py init "feature/" ``` ### 6. Transfer Session Context diff --git a/.claude/skills/write-skill/SKILL.md b/.claude/skills/write-skill/SKILL.md index 9fb8c4dd2..37903c69a 100644 --- a/.claude/skills/write-skill/SKILL.md +++ b/.claude/skills/write-skill/SKILL.md @@ -65,31 +65,33 @@ Skills that represent workflow steps should write to `.workflow/state.json` on c **When to write state:** Only if the skill represents a gate that hooks check (gates, verify, QA, review). Supporting knowledge skills (ext-verify, tui-verify, cli-verify, mcp-verify) do NOT write state — the orchestrator skill that invokes them does. -**How to write state:** +**How to write state:** Use the utility script — never write JSON by hand: +```bash +python scripts/workflow-state.py set ``` -After successful completion, read `.workflow/state.json` (create if missing), -update the relevant field with an ISO 8601 timestamp, and write the file back. + +Magic values: `now` → UTC ISO timestamp, `true`/`false` → boolean, digits → integer. 
Example for /gates: -{ - "gates": { - "passed": "2026-03-16T16:00:00Z", - "commit_ref": "" - } -} +```bash +python scripts/workflow-state.py set gates.passed now +python scripts/workflow-state.py set gates.commit_ref "$(git rev-parse HEAD)" ``` **State fields by skill:** -| Skill | Field | Value | -|-------|-------|-------| -| `/gates` | `gates.passed`, `gates.commit_ref` | ISO timestamp, HEAD SHA | -| `/verify` | `verify.{surface}` | ISO timestamp | -| `/qa` | `qa.{surface}` | ISO timestamp | -| `/review` | `review.passed`, `review.findings` | ISO timestamp, finding count | -| `/implement` | `branch`, `spec`, `plan`, `started` | strings, ISO timestamp | -| `/pr` | `pr.url`, `pr.created` | URL string, ISO timestamp | +| Skill | Command | +|-------|---------| +| `/gates` | `set gates.passed now` + `set gates.commit_ref {HEAD}` | +| `/verify` | `set verify.{surface} now` | +| `/qa` | `set qa.{surface} now` | +| `/review` | `set review.passed now` + `set review.findings {count}` | +| `/implement` | `set branch {name}` + `set spec {path}` + `set plan {path}` + `set started now` | +| `/pr` | `set pr.url {url}` + `set pr.created now` | +| `/start` | `init {branch-name}` (creates fresh state file) | +| `/converge` | `set-null gates.passed` + `set-null gates.commit_ref` | +| `/reset` | `delete` (removes state file) | ## Skill Categories @@ -110,6 +112,6 @@ When creating a new skill: 1. Name follows `{action}` or `{action}-{qualifier}` convention 2. SKILL.md has frontmatter with `name` and `description` 3. Description is written for AI discoverability (trigger words, not technology) -4. If it represents a workflow gate, it writes to `.workflow/state.json` +4. If it represents a workflow gate, it calls `python scripts/workflow-state.py` 5. If it references other skills, it uses current names (not old names) 6. 
Directory is `.claude/skills//SKILL.md` diff --git a/CLAUDE.md b/CLAUDE.md index f63966ade..ee80fd5f5 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -44,9 +44,8 @@ SDK, CLI, TUI, VS Code Extension, and MCP server for Power Platform development. ## Specs -- Constitution: `specs/CONSTITUTION.md` — read before any work +- Constitution: `specs/CONSTITUTION.md` — read before any work (includes Spec Laws SL1–SL5) - Template: `specs/SPEC-TEMPLATE.md` -- Index: `specs/README.md` ## Extension Versioning diff --git a/specs/tui-foundation.md b/docs/plans/2026-02-05-tui-foundation-spec.md similarity index 100% rename from specs/tui-foundation.md rename to docs/plans/2026-02-05-tui-foundation-spec.md diff --git a/specs/vscode-persistence-and-solutions-polish.md b/docs/plans/2026-03-14-vscode-persistence-and-solutions-polish.md similarity index 100% rename from specs/vscode-persistence-and-solutions-polish.md rename to docs/plans/2026-03-14-vscode-persistence-and-solutions-polish.md diff --git a/docs/plans/2026-03-18-panel-parity-project.md b/docs/plans/2026-03-18-panel-parity-project.md new file mode 100644 index 000000000..b5ab6423f --- /dev/null +++ b/docs/plans/2026-03-18-panel-parity-project.md @@ -0,0 +1,133 @@ +# Panel Parity Project Coordination + +**Created:** 2026-03-18 +**Source:** Extracted from `specs/panel-parity.md` during SL1 decomposition +**Purpose:** Project coordination content (shared patterns, phasing, issue tracking) that does not belong in individual feature specs + +--- + +## Shared Patterns (All Panels) + +| Pattern | Description | Status | +|---------|-------------|--------| +| Virtual table | `DataTable` component — sortable columns, row selection, keyboard nav (`data-table.ts`) | Done | +| Environment picker | Toolbar dropdown with environment theming via `data-env-type` / `data-env-color` CSS attributes | Done | +| Solution filter | `SolutionFilter` component — dropdown with `storageKey`-based persistence (`solution-filter.ts`) | Done (ConnRefs, 
EnvVars); WebResources uses raw ``) | +| `resolveEnvironmentId` + `openInMaker` | AC-PT-18 | PluginTraces | + +--- + +## Roadmap + +- **Connection Picker (option c):** Add connection binding dropdown to Connection References panel after parity +- **Plugin Registration:** Full plugin registration panels across all surfaces (separate spec) +- **Solution History:** Expand Import Jobs to include exports/publishes/uninstalls (matches Maker solutionsHistory) +- **Bulk operations UI:** Surface existing bulk operation infrastructure in panel actions diff --git a/docs/plans/2026-03-18-query-engine-roadmap.md b/docs/plans/2026-03-18-query-engine-roadmap.md new file mode 100644 index 000000000..bbe39b91f --- /dev/null +++ b/docs/plans/2026-03-18-query-engine-roadmap.md @@ -0,0 +1,355 @@ +# Query Engine Roadmap + +> Roadmap extracted from specs/query-engine-v2.md during spec governance restructuring. +> Phase 0 (Foundation) has been absorbed into specs/query.md as core architecture. +> This file retains the unimplemented phases (1-7) as a roadmap for future development. + +**Clean Room:** All implementations derive from Microsoft FetchXML documentation, T-SQL language specification (ANSI SQL:2016), and Dataverse SDK documentation. No third-party query engine code is referenced or adapted. 
+ +**Tech Stack:** C# (.NET 8+), Terminal.Gui 1.19+, Microsoft.Data.SqlClient (Phase 3.5), xUnit + +--- + +## Phase Dependency Graph + +``` +Phase 0 (Foundation) — IMPLEMENTED, see specs/query.md + ├── Phase 1 (Core Gaps) ← requires expression evaluator, plan layer + │ ├── Phase 2 (Composition) ← requires planner rewrite rules + │ └── Phase 3 (Functions) ← requires expression evaluator + function registry + │ └── Phase 3.5 (TDS) ← requires plan layer for routing + ├── Phase 4 (Parallel) ← requires plan layer + pool integration + ├── Phase 5 (DML) ← requires expression evaluator + plan layer + ├── Phase 6 (Metadata) ← requires plan layer for MetadataScanNode + └── Phase 7 (Advanced) ← requires all of the above +``` + +Phases 1, 2, 3 can proceed in parallel after Phase 0. +Phases 4, 5, 6 can proceed in parallel after Phase 1. +Phase 3.5 can proceed after Phase 3 (or in parallel with reduced scope). +Phase 7 is the final phase. + +--- + +## Phase 1: Core SQL Gaps + +**Goal:** HAVING, CASE/IIF, basic computed column expressions, COUNT(*) optimization. + +### HAVING Clause + +**Parser:** Add `SqlTokenType.Having`. After GROUP BY parsing, look for `HAVING` keyword. Parse condition into `ISqlCondition` stored in `SqlSelectStatement.Having`. + +**Planning:** FetchXML does NOT support filtering on aggregate results, so HAVING always produces a `ClientFilterNode` after the `FetchXmlScanNode`. + +``` +FetchXmlScanNode (with aggregate=true) → ClientFilterNode(having condition) → ProjectNode +``` + +**ClientFilterNode:** Iterates input rows, yields only those where `evaluator.EvaluateCondition(condition, row)` is true. + +### CASE / IIF Expressions + +**Parser:** Add `CASE`, `WHEN`, `THEN`, `ELSE`, `END`, `IIF` keywords. + +Both produce `ISqlExpression` nodes (`SqlCaseExpression`, `SqlIifExpression`). They appear in SELECT column list, WHERE/HAVING conditions, and ORDER BY. + +**Planning:** CASE/IIF in SELECT creates a `ProjectNode` with computed output columns. 
In WHERE, pushable parts go server-side; CASE comparison uses `ClientFilterNode`. + +### Computed Column Expressions + +`revenue * 0.1 AS tax` → `SqlComputedColumn(SqlBinaryExpression(revenue, Multiply, 0.1), "tax")` + +The `FetchXmlScanNode` requests the base columns needed by the expressions; `ProjectNode` evaluates them using the expression evaluator. + +### COUNT(*) Optimization + +For unfiltered `SELECT COUNT(*) FROM entity` (no WHERE, no JOIN, no GROUP BY), use `RetrieveTotalRecordCountRequest` (near-instant metadata read) via `CountOptimizedNode`. Fall back to aggregate FetchXML if unsupported. + +### Condition Expressions (WHERE with expressions) + +Extend WHERE to allow `ISqlExpression` on both sides of comparisons. Column-to-column and computed expressions use `ClientFilterNode`; literal comparisons push to FetchXML. + +### Deliverables + +| Item | Tests | +|------|-------| +| HAVING parsing + AST | Parser tests for HAVING with aggregates | +| ClientFilterNode | Filter node with mock data | +| CASE/IIF parsing + AST | Parser tests for nested CASE, IIF | +| CASE/IIF evaluation | Expression evaluator tests | +| Computed columns parsing | `revenue * 0.1 AS tax` parses correctly | +| Computed column projection | ProjectNode evaluates arithmetic | +| COUNT(*) optimization | CountOptimizedNode with mock service | +| Expression conditions | `WHERE revenue > cost` parsed and planned | + +--- + +## Phase 2: Query Composition + +**Goal:** IN/EXISTS subqueries, UNION/UNION ALL. + +### IN Subquery -> JOIN Rewrite + +```sql +SELECT name FROM account +WHERE accountid IN (SELECT parentaccountid FROM opportunity WHERE revenue > 1000000) +``` + +The planner rewrites to an INNER JOIN at the AST level before transpiling to FetchXML, so Dataverse handles the join server-side. When rewrite isn't possible (correlated subqueries), fall back to two-phase execution or `ClientFilterNode` with hash set lookup. 
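The hash-set fallback amounts to a two-phase client-side filter. A minimal sketch (illustrative only — the real `ClientFilterNode` operates on plan-layer rows, and the names here are hypothetical):

```python
from typing import Iterable, Iterator

def in_subquery_fallback(outer_rows: Iterable[dict],
                         key: str,
                         subquery_values: Iterable) -> Iterator[dict]:
    # Phase 1: materialize the subquery result into a hash set.
    # Phase 2: stream outer rows, keeping only those whose key is in the set.
    lookup = set(subquery_values)
    for row in outer_rows:
        if row.get(key) in lookup:
            yield row
```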
+ +### EXISTS Subquery + +EXISTS with correlated reference -> rewrite to INNER JOIN. NOT EXISTS -> LEFT JOIN + IS NULL. Both produce FetchXML ``. + +### UNION / UNION ALL + +**AST:** `SqlUnionStatement` with list of `SqlSelectStatement` queries and `IsUnionAll` flags per boundary. + +**Planning:** `ConcatenateNode` yields rows from each child. `DistinctNode` deduplicates for UNION (without ALL) using `HashSet`. + +### Deliverables + +| Item | Tests | +|------|-------| +| IN subquery parsing | `WHERE id IN (SELECT ...)` parses | +| IN -> JOIN rewrite | Rewrite produces correct FetchXML | +| IN fallback (large results) | Hash set client filter works | +| EXISTS parsing | `WHERE EXISTS (SELECT ...)` parses | +| EXISTS -> JOIN rewrite | Correlated EXISTS becomes INNER JOIN | +| NOT EXISTS -> LEFT JOIN | Produces IS NULL filter | +| UNION/UNION ALL parsing | Multiple SELECTs parse into SqlUnionStatement | +| ConcatenateNode | Yields rows from both children | +| DistinctNode | Deduplicates on UNION | +| Column count validation | Mismatched UNION column counts error | + +--- + +## Phase 3: Functions + +**Goal:** String functions, date functions, CAST/CONVERT. Client-side evaluated with server-side pushdown where FetchXML supports it. 
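Client-side evaluation dispatches through a function registry keyed by name. A minimal sketch of that dispatch, including the T-SQL 1-based `CHARINDEX` and NULL-safe `CONCAT` semantics (registry API and names are illustrative, not the engine's actual `FunctionRegistry`):

```python
# Minimal registry sketch: the evaluator dispatches SqlFunctionExpression by name.
FUNCTIONS = {}

def register(name):
    def wrap(fn):
        FUNCTIONS[name.upper()] = fn
        return fn
    return wrap

@register("CHARINDEX")
def charindex(find, expr, start=1):
    # T-SQL semantics: 1-based start and result, 0 when not found
    return expr.find(find, start - 1) + 1

@register("CONCAT")
def concat(*args):
    # NULL-safe per T-SQL CONCAT: None is treated as an empty string
    return "".join("" if a is None else str(a) for a in args)

def call(name, *args):
    return FUNCTIONS[name.upper()](*args)
```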
+ +### String Functions + +| Function | Signature | Notes | +|----------|-----------|-------| +| `UPPER(expr)` | -> string | Client-side | +| `LOWER(expr)` | -> string | Client-side | +| `LEN(expr)` | -> int | Client-side | +| `LEFT(expr, n)` | -> string | Client-side | +| `RIGHT(expr, n)` | -> string | Client-side | +| `SUBSTRING(expr, start, length)` | -> string | 1-based start per T-SQL | +| `TRIM(expr)` / `LTRIM` / `RTRIM` | -> string | Client-side | +| `REPLACE(expr, find, replace)` | -> string | Client-side | +| `CHARINDEX(find, expr [, start])` | -> int | 1-based result, 0 = not found | +| `CONCAT(expr, expr, ...)` | -> string | Variadic, NULL-safe | +| `STUFF(expr, start, length, replace)` | -> string | Client-side | +| `REVERSE(expr)` | -> string | Client-side | + +Functions are registered in a `FunctionRegistry` keyed by name. The expression evaluator dispatches `SqlFunctionExpression` to the registry. + +### Date Functions + +| Function | Signature | Notes | +|----------|-----------|-------| +| `GETDATE()` / `GETUTCDATE()` | -> datetime | Current UTC time | +| `YEAR(expr)` / `MONTH(expr)` / `DAY(expr)` | -> int | FetchXML-native in GROUP BY | +| `DATEADD(part, n, expr)` | -> datetime | Client-side | +| `DATEDIFF(part, start, end)` | -> int | Client-side | +| `DATEPART(part, expr)` | -> int | Client-side | +| `DATETRUNC(part, expr)` | -> datetime | Client-side | + +**Server pushdown for GROUP BY:** `YEAR/MONTH/DAY(column)` in GROUP BY maps to native FetchXML date grouping (`dategrouping="year"`). + +### CAST / CONVERT + +Supported target types: `int`, `bigint`, `decimal(p,s)`, `float`, `nvarchar(n)`, `varchar(n)`, `datetime`, `date`, `bit`, `uniqueidentifier`, `money`. + +--- + +## Phase 3.5: TDS Endpoint Acceleration + +**Goal:** Optional acceleration path for read queries using the Dataverse TDS Endpoint (SQL Server wire protocol against a read-only replica). 
+ +> **Note:** TDS integration (DI wiring, UI toggle, error handling) is tracked in the query parity project plan. This phase covers the query engine routing and compatibility checking aspects. + +### TDS Compatibility Check + +A query is TDS-compatible when: +- It is a SELECT (no DML) +- All referenced entities support TDS (no elastic tables, no virtual entities) +- No PPDS-specific features used (virtual *name columns) +- The SQL is expressible in standard T-SQL + +### TdsScanNode + +Executes SQL directly against TDS endpoint via `SqlConnection`, yields rows via `SqlDataReader`. + +### Auth Integration + +No new auth flow needed. TDS endpoint accepts the same OAuth bearer token via `SqlConnection.AccessToken`. + +--- + +## Phase 4: Parallel Execution Intelligence + +**Goal:** Pool-aware parallel aggregate partitioning, parallel page fetching, EXPLAIN command. + +### Parallel Aggregate Partitioning + +When an aggregate query fails with the 50K limit error, retry with partitioned strategy: +1. Estimate record count via `RetrieveTotalRecordCountRequest` +2. Calculate partition count: `ceil(estimatedCount / 40000)` +3. Generate N date-range partitions with non-overlapping `createdon` filters +4. Execute ALL partitions in parallel across the connection pool +5. Merge-aggregate partial results (COUNT->sum, SUM->sum, AVG->sum/count, MIN->min, MAX->max) + +``` +ParallelPartitionNode +├── FetchXmlScanNode (partition 1, aggregate) +├── FetchXmlScanNode (partition 2, aggregate) +└── MergeAggregateNode (combines partial aggregates) +``` + +With pool capacity of 48, all partitions execute simultaneously — 48x faster than single-connection tools. + +### Parallel Page Prefetch + +`PrefetchScanNode` wraps `FetchXmlScanNode`, using a bounded `Channel` to fetch pages ahead while the consumer processes current results. 
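The bounded-prefetch pattern can be sketched with a bounded `queue.Queue` standing in for the .NET bounded `Channel` (illustrative only; the real node is C#):

```python
import queue
import threading
from typing import Iterator, Optional

def prefetch(pages: Iterator[list], depth: int = 2) -> Iterator[list]:
    # Producer fetches pages ahead of the consumer; the bounded queue blocks
    # the producer once `depth` pages are buffered, capping memory use.
    q: "queue.Queue[Optional[list]]" = queue.Queue(maxsize=depth)
    SENTINEL = None  # signals end of the page stream

    def producer():
        for page in pages:
            q.put(page)
        q.put(SENTINEL)

    threading.Thread(target=producer, daemon=True).start()
    while (page := q.get()) is not SENTINEL:
        yield page
```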
+ +### EXPLAIN Command + +``` +EXPLAIN SELECT COUNT(*) FROM account GROUP BY ownerid +``` + +Output: Plan tree with estimated rows, partition info, pool capacity, effective parallelism. + +**CLI:** `ppds query sql "..." --explain` or `ppds query explain "..."` +**TUI:** Ctrl+Shift+E in SQL query screen + +--- + +## Phase 5: DML via SQL + +**Goal:** INSERT, UPDATE, DELETE via SQL syntax, leveraging existing `IBulkOperationExecutor` infrastructure. + +### Safety Model + +- `DELETE FROM entity` (no WHERE) BLOCKED at parse time +- `UPDATE entity SET ...` (no WHERE) BLOCKED at parse time +- Before execution: show estimated row count, require confirmation +- CLI: `--confirm` flag or interactive prompt; `--dry-run` shows plan without executing +- Default row cap: 10,000. Override with `--no-limit` +- All DML reports progress via `IProgressReporter` + +### INSERT + +```sql +INSERT INTO account (name, revenue) VALUES ('Contoso', 1000000) +INSERT INTO account (name, revenue) SELECT name, revenue FROM account WHERE statecode = 0 +``` + +Pipelines source rows through `IBulkOperationExecutor.CreateMultipleAsync()`. + +### UPDATE + +```sql +UPDATE account SET revenue = revenue * 1.1 WHERE statecode = 0 +``` + +Two-phase: Retrieve target records via FetchXML, evaluate SET expressions per row, feed to `UpdateMultipleAsync()`. + +### DELETE + +```sql +DELETE FROM opportunity WHERE statecode = 2 AND actualclosedate < '2020-01-01' +``` + +Retrieve target IDs via FetchXML, feed to `DeleteMultipleAsync()`. + +--- + +## Phase 6: Metadata & Streaming + +**Goal:** Query Dataverse metadata via SQL, progressive result streaming in TUI. + +### Metadata Schema + +```sql +SELECT logicalname, displayname FROM metadata.entity WHERE iscustomentity = 1 +``` + +Virtual tables: `metadata.entity`, `metadata.attribute`, `metadata.relationship_1_n`, `metadata.relationship_n_n`, `metadata.optionset`, `metadata.optionsetvalue`. + +`MetadataScanNode` executes appropriate metadata requests and yields rows. 
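As a rough illustration of what a `MetadataScanNode`-style scan does, here is a hypothetical Python sketch: rows built from metadata responses are filtered by the WHERE predicate and projected to the selected columns. The row shapes and field names are assumptions for illustration, not the actual PPDS types.

```python
# Hypothetical client-side evaluation of a metadata virtual table.
# Rows arrive as dicts from metadata requests; the node applies the
# WHERE predicate and projects the selected columns, yielding lazily.

def metadata_scan(rows, predicate, columns):
    """Yield projected rows from a virtual metadata table."""
    for row in rows:
        if predicate(row):
            yield {col: row.get(col) for col in columns}

entities = [
    {"logicalname": "account", "displayname": "Account", "iscustomentity": False},
    {"logicalname": "new_widget", "displayname": "Widget", "iscustomentity": True},
]

# SELECT logicalname, displayname FROM metadata.entity WHERE iscustomentity = 1
custom = list(metadata_scan(
    entities,
    predicate=lambda r: r["iscustomentity"],
    columns=["logicalname", "displayname"],
))
# → [{"logicalname": "new_widget", "displayname": "Widget"}]
```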
+
+### Progressive Result Streaming
+
+The TUI consumes `IAsyncEnumerable` incrementally, refreshing the UI every 100 rows. Combined with `PrefetchScanNode`, this gives zero-wait pagination.
+
+---
+
+## Phase 7: Advanced
+
+**Goal:** Window functions, variables, flow control. Power-user features with lower priority.
+
+### Window Functions
+
+```sql
+SELECT name, revenue,
+       ROW_NUMBER() OVER (ORDER BY revenue DESC) AS rank,
+       SUM(revenue) OVER (PARTITION BY industrycode) AS industry_total
+FROM account
+```
+
+Always client-side (FetchXML has no window function support). `ClientWindowNode` materializes all input rows, sorts/partitions them, and computes function values.
+
+### Variables
+
+```sql
+DECLARE @threshold MONEY = 1000000
+SELECT name, revenue FROM account WHERE revenue > @threshold
+```
+
+Variables in WHERE clauses are resolved at plan time (substituted as literals into FetchXML).
+
+### Flow Control
+
+IF/ELSE, WHILE, and BEGIN/END blocks create a `ScriptExecutionNode` — a mini interpreter. Lowest priority; variables without flow control cover roughly 80% of use cases.
+
+---
+
+## Cross-Cutting Concerns
+
+### Error Handling
+
+New error codes: `QUERY_AGGREGATE_LIMIT`, `QUERY_DML_BLOCKED`, `QUERY_DML_ROW_CAP`, `QUERY_TDS_INCOMPATIBLE`, `QUERY_PLAN_TIMEOUT`, `QUERY_TYPE_MISMATCH`.
+
+### Cancellation
+
+All plan nodes accept `CancellationToken` via `QueryPlanContext`. TUI Ctrl+C sets the token.
+
+### Progress Reporting
+
+Plan nodes expected to run longer than one second accept `IProgressReporter` via `QueryPlanContext`.
+
+### Memory Bounds
+
+Client-side nodes that materialize data (DistinctNode, ClientWindowNode, HashJoinNode) have configurable memory limits. Default: 500MB.
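The memory bound above can be enforced with a running size estimate while buffering. A hedged Python sketch (the error name and byte accounting are illustrative; real nodes would track sizes more precisely and map the failure to a query error code):

```python
import sys

class MemoryLimitExceeded(Exception):
    """Illustrative error; would map to the QUERY_* error-code family."""

def materialize_bounded(rows, limit_bytes=500 * 1024 * 1024):
    """Buffer all rows, failing fast once the approximate size limit is hit."""
    buffered, approx_bytes = [], 0
    for row in rows:
        approx_bytes += sys.getsizeof(row)   # rough per-row estimate
        if approx_bytes > limit_bytes:
            raise MemoryLimitExceeded(
                f"materialization exceeded {limit_bytes} bytes")
        buffered.append(row)
    return buffered
```

Failing fast with a typed error keeps a runaway DISTINCT or window function from exhausting process memory.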
+ +--- + +## Testing Strategy + +### Unit Test Categories + +| Category | Scope | Infrastructure | +|----------|-------|----------------| +| `TuiUnit` | AST, parser, transpiler, expression evaluator | No mocks needed | +| `PlanUnit` | Plan construction, node logic | Mock IQueryExecutor, mock data | +| `IntegrationQuery` | End-to-end against live Dataverse | Real connection pool | +| `IntegrationTds` | TDS endpoint queries | Real TDS connection | + +### Regression Suite + +SQL conformance test suite: JSON file mapping SQL input to expected FetchXML output and expected result shapes. Run on every commit. diff --git a/docs/plans/2026-03-18-query-parity-project.md b/docs/plans/2026-03-18-query-parity-project.md new file mode 100644 index 000000000..005d5066b --- /dev/null +++ b/docs/plans/2026-03-18-query-parity-project.md @@ -0,0 +1,198 @@ +# Query Parity Project Plan + +> Project plan extracted from specs/query-parity.md during spec governance restructuring. +> Permanent domain behavior (hints, cross-env, Extension surface) has been absorbed into specs/query.md. +> This file retains project-tracking content: implementation phases, dead code cleanup, daemon migration steps, and error mapping. + +--- + +## Implementation Phases + +| Phase | Description | Dependencies | +|-------|-------------|--------------| +| 1 | Daemon uses `SqlQueryService` | None | +| 2 | Query hints integration in `SqlQueryService` | Phase 1 (daemon benefits immediately) | +| 3 | VS Code environment colors | None (can parallel with Phase 2) | +| 4 | Cross-environment query attribution | Phase 1 (needs data source collection from plan) | +| 5 | TDS Endpoint wiring (DI, reporting, UI, error handling) | Phase 1 (needs `SqlQueryService` integration) | + +--- + +## Phase 1: Daemon Uses SqlQueryService + +### Core Requirements + +1. `RpcMethodHandler.QuerySqlAsync()` delegates ALL query execution to `SqlQueryService.ExecuteAsync()` — no inline transpilation or direct executor calls +2. 
`RpcMethodHandler.QueryExplainAsync()` delegates to `SqlQueryService.ExplainAsync()` — no inline parser/planner/generator instantiation +3. `RpcMethodHandler.QueryExportAsync()` has two input paths: + - **SQL input** (`request.Sql`): Delegates transpilation to `SqlQueryService.TranspileSql()` and execution to `SqlQueryService.ExecuteAsync()` + - **FetchXML input** (`request.FetchXml`): Stays as-is — raw FetchXML is already the final format +4. The daemon wires `RemoteExecutorFactory`, `ProfileResolver`, `EnvironmentSafetySettings`, and `EnvironmentProtectionLevel` on the `SqlQueryService` instance, using the same pattern as `InteractiveSession.GetSqlQueryServiceAsync()` (lines 321-355) +5. The daemon's inline DML safety pre-check (lines 954-1001) is removed — `SqlQueryService.PrepareExecutionAsync()` handles DML safety +6. All bespoke helper methods are deleted (see Dead Code Cleanup section) +7. The `QueryResultResponse` RPC DTO contract is unchanged — `SqlQueryResult` maps to the existing fields +8. TDS routing is controlled via `SqlQueryRequest.UseTdsEndpoint` — the service handles TDS internally + +### Primary Flow + +**query/sql RPC call (execute mode):** + +1. **Validate**: Check `request.Sql` is non-empty +2. **Resolve environment**: Call `WithProfileAndEnvironmentAsync(request.EnvironmentUrl, ...)` to get service provider +3. **Get service**: Get `ISqlQueryService` from provider, cast to `SqlQueryService` +4. **Wire cross-env**: Load `EnvironmentConfigStore` configs, create `ProfileResolutionService`, set `RemoteExecutorFactory` (label -> resolve config -> get provider -> get `IQueryExecutor`) +5. **Wire safety**: Set `ProfileResolver`, `EnvironmentSafetySettings`, `EnvironmentProtectionLevel` from environment config +6. **Build request**: Map RPC `QuerySqlRequest` fields to `SqlQueryRequest` +7. **Execute**: Call `service.ExecuteAsync(sqlRequest, cancellationToken)` +8. **Map response**: Convert `SqlQueryResult` to `QueryResultResponse` +9. 
**Save history**: Fire-and-forget history save (unchanged)
+
+**query/sql RPC call (showFetchXml mode):**
+
+When `request.ShowFetchXml` is true, use `service.TranspileSql(request.Sql, request.Top)` instead of `ExecuteAsync()` and return the FetchXML string in the response. No plan building, no execution, no wiring needed.
+
+### Error Mapping
+
+| Service Exception | RPC Error Code | Mapping |
+|-------------------|---------------|---------|
+| `QueryParseException` | `ErrorCodes.Query.ParseError` | Wrap in `RpcException` with parse error details |
+| `PpdsException` with `DmlBlocked` ErrorCode | `ErrorCodes.Query.DmlBlocked` | Wrap in `RpcException` with `DmlSafetyErrorData { DmlBlocked = true }` |
+| `PpdsException` with `DmlConfirmationRequired` ErrorCode | `ErrorCodes.Query.DmlConfirmationRequired` | Wrap in `RpcException` with `DmlSafetyErrorData { DmlConfirmationRequired = true }` |
+| `PpdsException` with `TdsIncompatible` ErrorCode | `ErrorCodes.Query.TdsIncompatible` | Wrap in `RpcException` with error message (Phase 5) |
+| `PpdsException` with `TdsConnectionFailed` ErrorCode | `ErrorCodes.Query.TdsConnectionFailed` | Wrap in `RpcException` with error message (Phase 5) |
+| `PpdsException` (other) | `ErrorCodes.Query.ExecutionFailed` | Wrap in `RpcException` with error message |
+| Other exceptions | Standard StreamJsonRpc error | Let StreamJsonRpc handle (500-level equivalent) |
+
+### Dead Code Cleanup
+
+After switching to `SqlQueryService`, the following code in `RpcMethodHandler.cs` must be cleaned up; each row notes whether it is deleted, relocated, or retained:
+
+| Code | Lines | Reason |
+|------|-------|--------|
+| `TranspileSqlToFetchXml()` private method | 1339-1370 | Replaced by `SqlQueryService.TranspileSql()` |
+| Inline DML safety check block | 954-1002 | Handled by `SqlQueryService.PrepareExecutionAsync()` |
+| `InjectTopAttribute()` private method | 1456-1473 | Retained — still used by `query/fetch` handler for raw FetchXML TOP injection |
+| Inline
`QueryParser`/`ExecutionPlanBuilder`/`FetchXmlGeneratorService` in `QueryExplainAsync` | 1261-1266 | Replaced by `SqlQueryService.ExplainAsync()` | +| TDS executor inline construction in `QuerySqlAsync` | 1012-1024 | Handled by `SqlQueryService` TDS routing | +| `FormatExportContent()` private method | 1372-1427 | Move to shared `QueryExportFormatter` utility | +| `ExtractDisplayValue()` private method | 1429-1440 | Moves with `FormatExportContent` | +| `CsvEscape()` private method | 1442-1449 | Moves with `FormatExportContent` | +| `using Microsoft.SqlServer.TransactSql.ScriptDom` import | Line 24 | No longer needed in handler | +| `using PPDS.Query.Transpilation` import | Line 29 | No longer needed in handler | + +**Keep:** `MapToResponse()` (refactor to accept `SqlQueryResult`), `MapQueryValue()`, `FireAndForgetHistorySave()`. + +### Constraints + +- The RPC response contract (`QueryResultResponse`) must not change +- The daemon must create `ProfileResolutionService` per request (or cache and invalidate when environment configs change) +- `SqlQueryService` is registered as transient — each call gets a fresh instance, wiring is per-call + +--- + +## Phase 2: Query Hints Integration + +### Changes Per Layer + +1. **`SqlQueryService.PrepareExecutionAsync()`**: After `QueryParser.Parse()`, call `QueryHintParser.Parse(fragment)`. Merge overrides into `QueryPlanOptions`. Create `QueryExecutionOptions` from execution-level hints. +2. **`QueryPlanOptions`**: Add `ForceClientAggregation` (bool) and `NoLock` (bool). +3. **`QueryExecutionOptions`** (new type): `BypassPlugins` (bool), `BypassFlows` (bool). Stored on `QueryPlanContext`. +4. **`QueryPlanContext`**: Add `QueryExecutionOptions? ExecutionOptions` property. +5. **`FetchXmlScanNode`**: When `NoLock` is true, inject `no-lock="true"` attribute into FetchXML string before execution. +6. **`IQueryExecutor`**: Add new overload with default interface implementation (backward-compatible). +7. 
**`QueryExecutor`** (concrete): Override new overload, apply `BypassPlugins`/`BypassFlows` as OrganizationRequest headers.
+8. **`FetchXmlScanNode`**: Call new overload, passing `context.ExecutionOptions`.
+9. **`ExecutionPlanBuilder.PlanSelect()`**: When `ForceClientAggregation` is true, route to client-side hash group plan.
+
+---
+
+## Phase 3: VS Code Environment Colors
+
+### Changes
+
+1. **Extension host** (`QueryPanel.ts`, `SolutionsPanel.ts`): Include `resolvedColor` from `daemon.envConfigGet()` in the `updateEnvironment` webview message.
+2. **Message types** (`message-types.ts`): Add `envColor?: string` to `updateEnvironment` message.
+3. **Webview** (`query-panel.ts`, `solutions-panel.ts`): On `updateEnvironment`, set `data-env-color` attribute on toolbar. CSS applies left border color.
+4. **CSS**: Map terminal color names to CSS custom properties (13 colors: Red through BrightBlue).
+
+---
+
+## Phase 4: Cross-Environment Query Attribution
+
+### Data Source Collection
+
+After `_planBuilder.Plan()` returns, walk the `QueryPlanResult.RootNode` tree to collect `RemoteScanNode.RemoteLabel` values. The local environment is always present.
+
+**Local environment label resolution**: From environment config label, falling back to display name, then URL.
+
+### VS Code Banner
+
+When `dataSources` has 2+ entries, render above results grid: "Data from: PPDS Dev (local) / QA (remote)" with environment colors.
+
+---
+
+## Phase 5: TDS Endpoint Wiring
+
+### DI Registration
+
+#### Daemon Path (`DaemonConnectionPoolManager.CreateProviderFromSources`)
+
+Pass `AuthProfile` and `environmentUrl` as parameters. Create `IPowerPlatformTokenProvider` from profile, wrap as `Func<Task<string>>`, register `ITdsQueryExecutor` as singleton per-provider.
+
+#### CLI Path (`ServiceRegistration.AddCliApplicationServices`)
+
+Resolve `ResolvedConnectionInfo` from DI, create token provider, register `ITdsQueryExecutor` as transient factory delegate.
+
+### Execution Mode Reporting
+
+- `SqlQueryResult.ExecutionMode` set by walking plan tree (any `TdsScanNode` -> Tds, else Dataverse)
+- `SqlQueryStreamChunk.ExecutionMode` set on final chunk for TUI streaming
+- `RpcMethodHandler` maps to `queryMode` RPC response field
+
+### TDS Failure Behavior
+
+- **Incompatible query**: `SqlQueryService.PrepareExecutionAsync()` pre-checks before planning. Throws `PpdsException(TdsIncompatible)`.
+- **Connection failure**: `TdsScanNode` lets `SqlException` propagate. `SqlQueryService` wraps in `PpdsException(TdsConnectionFailed)`.
+- **Design principle**: Never silently substitute Dataverse for TDS.
+
+### VS Code Extension Changes
+
+- Un-comment TDS Read Replica menu item in `query-panel.ts:511-512`
+- TDS errors display via existing error banner (no special UI needed)
+
+### TUI Changes
+
+- Status label after execution reflects actual mode: "Returned N rows in Xms via TDS/Dataverse"
+- TDS errors handled by existing `ErrorService.ReportError()`
+
+---
+
+## Acceptance Criteria
+
+| ID | Criterion | Status |
+|----|-----------|--------|
+| AC-01 | Daemon `query/sql` executes via `SqlQueryService.ExecuteAsync()` | Pending |
+| AC-02 | Daemon query results include virtual column expansion | Pending |
+| AC-03 | `[LABEL].entity` syntax works in daemon | Pending |
+| AC-04 | `-- ppds:USE_TDS` routes through TDS endpoint | Pending |
+| AC-05 | `-- ppds:NOLOCK` produces `no-lock="true"` in the generated FetchXML | Pending |
+| AC-06 | `-- ppds:BYPASS_PLUGINS` sets header | Pending |
+| AC-07 | `-- ppds:BYPASS_FLOWS` sets header | Pending |
+| AC-08-13 | Remaining hint ACs (MAX_ROWS, MAXDOP, HASH_GROUP, BATCH_SIZE, precedence, malformed) | Pending |
+| AC-14-15 | VS Code environment colors | Pending |
+| AC-16-18 | Data source attribution | Pending |
+| AC-19 | Hints work identically across all interfaces | Pending |
+| AC-20 | Daemon no longer contains inline SQL transpilation | Pending |
+| AC-21-29 | Remaining Phase 1-4 ACs | Pending |
+| AC-30-48 | Phase 5 TDS
wiring ACs | Pending | + +--- + +## Follow-Up Work (Out of Scope) + +- GitHub issue in `ppds-docs` for query hints documentation page +- IntelliSense completions for `-- ppds:` prefix in SQL editor +- TDS toggle re-enabled in Phase 5 (pre-existing menu item) +- Environment color in QuickPick environment picker +- `IProgressReporter` for `SqlQueryService` (separate enhancement) +- `query/fetchxml` handler virtual column expansion (tracked separately) diff --git a/docs/plans/2026-03-18-spec-governance.md b/docs/plans/2026-03-18-spec-governance.md new file mode 100644 index 000000000..542cc96eb --- /dev/null +++ b/docs/plans/2026-03-18-spec-governance.md @@ -0,0 +1,164 @@ +# Spec Governance — Implementation Plan + +**Spec:** [specs/spec-governance.md](../../specs/spec-governance.md) +**Branch:** `feature/spec-governance` +**Scope:** Session 1 — governance infrastructure only. Session 2 (spec restructuring) is a separate plan. + +--- + +## Phase 1: Constitution Amendment + +**Goal:** Add Spec Laws (SL1–SL5) to the Constitution. + +**Tasks:** + +1.1. Add `## Spec Laws` section to `specs/CONSTITUTION.md` after Resource Laws, with laws SL1–SL5 as specified in the spec. + +1.2. Update `CLAUDE.md` Specs section: +- Remove `Index: specs/README.md` reference +- Add reference to `specs/CONSTITUTION.md` Spec Laws +- Keep existing pointers to Constitution and Template + +**Verification:** Read `specs/CONSTITUTION.md` and confirm SL1–SL5 present. Read `CLAUDE.md` and confirm updated references. + +**ACs covered:** AC-01, AC-17 + +--- + +## Phase 2: Spec Template Update + +**Goal:** Update the template with Surfaces frontmatter, Changelog, and stronger Code: guidance. + +**Tasks:** + +2.1. 
Update `specs/SPEC-TEMPLATE.md` frontmatter: +- Remove `**Version:** 1.0` line +- Add `**Surfaces:** CLI | TUI | Extension | MCP | All | N/A` line after `**Code:**` +- Update `**Code:**` line with guidance comment requiring grep-friendly path prefixes +- Update `**Status:**` to show `Draft | Implemented` + +2.2. Add `## Changelog` section to template (before Roadmap): +```markdown +## Changelog + +| Date | Change | +|------|--------| +| YYYY-MM-DD | Initial spec | +``` + +2.3. Add surface section guidance within the Specification section, showing the pattern for CLI/TUI/Extension/MCP subsections. + +**Verification:** Read template, confirm Version removed, Surfaces present, Changelog section present, Code: guidance strengthened. + +**ACs covered:** AC-02, AC-03, AC-04 + +--- + +## Phase 3: Skill Updates — README Migration + +**Goal:** Migrate 6 skills from `specs/README.md` lookup to frontmatter grep. Remove hardcoded mappings from `/implement`. + +**Tasks are independent — can be parallelized.** + +3.1. Update `.claude/commands/spec.md`: +- Step 1: Replace `specs/README.md` reference with frontmatter grep instruction +- Step 3: Replace hardcoded cross-reference rules with frontmatter-based discovery +- Add Step 6 instruction: regenerate `specs/README.md` after create/update/reconcile + +3.2. Update `.claude/commands/spec-audit.md`: +- Step 1: Replace `specs/README.md` reference with frontmatter grep +- Step 2 (full audit): Replace "Read specs/README.md to get list of all specs" with glob `specs/*.md` + +3.3. Update `.claude/commands/gates.md`: +- Replace `Read specs/README.md to map changed files to specs` with frontmatter grep instruction + +3.4. Update `.claude/commands/review.md`: +- Step 2: Replace `Map changed files to specs via specs/README.md` with frontmatter grep + +3.5. 
Update `.claude/commands/implement.md`: +- Step 2B: Replace README reference and hardcoded code-path-to-spec mappings with frontmatter grep +- Add instruction: when creating new code paths for a spec, update the spec's `**Code:**` frontmatter + +3.6. Update `.claude/skills/ext-panels/SKILL.md`: +- Replace `specs/README.md` reference with frontmatter grep + +**Verification:** Grep all updated skills for "README" — zero matches for spec discovery (README may still appear in the context of "regenerate README" in /spec). Grep for hardcoded path mappings in /implement — zero matches. + +**ACs covered:** AC-07, AC-08, AC-09, AC-10, AC-11, AC-12, AC-16 + +--- + +## Phase 4: Enhanced /spec — Reconcile Mode + +**Goal:** Add reconcile mode to the /spec skill. + +**Tasks:** + +4.1. Update `.claude/commands/spec.md` Step 2 (Determine Mode): +- Add third mode: reconcile. Triggered when spec exists and user requests alignment with code, or when significant code divergence is detected. + +4.2. Add reconcile mode behavior to `/spec`: +- Read spec ACs, Code paths, type descriptions +- Read code at Code: paths +- Enumerate public types, methods, behaviors +- Compare against existing ACs (missing, stale, status drift) +- Member-count verification: count public surface area vs AC coverage, warn if <90% +- Present proposed changes for user approval +- After writing: regenerate README, append to Changelog + +**Verification:** Read `/spec` skill, confirm three modes documented with reconcile behavior including member-count verification step. + +**ACs covered:** AC-13, AC-14 + +--- + +## Phase 5: Auto-Generated README + +**Goal:** Generate the initial README from frontmatter of all existing specs. + +**Tasks:** + +5.1. 
Write a generation script (`scripts/generate-spec-readme.py`) that: +- Globs `specs/*.md` (excluding CONSTITUTION.md, SPEC-TEMPLATE.md, README.md) +- Extracts from each: H1 title, Status, Surfaces (or "—" if missing), Code paths, first sentence of Overview +- Generates markdown table sorted alphabetically +- Writes to `specs/README.md` with auto-generated comment header + +5.2. Run the script to generate the initial README. + +5.3. Add instruction to `/spec` skill (Step 6) to run the generation script after spec operations. + +**Note:** Existing specs don't yet have `**Surfaces:**` frontmatter — the generated README will show "—" for that column until session 2 adds it. This is expected and acceptable. + +**Verification:** Run the script. Count specs in generated README vs `ls specs/*.md` count. Verify all match. Confirm auto-generated comment present. + +**ACs covered:** AC-05, AC-06, AC-15 + +--- + +## Phase 6: Commit & Gates + +After all phases complete: + +6.1. Commit all changes. +6.2. Run `/gates` to verify nothing is broken (build, lint, test). + +--- + +## Dependency Graph + +``` +Phase 1 (Constitution) ──┐ + ├──▶ Phase 3 (Skill Updates) ──▶ Phase 4 (/spec Reconcile) +Phase 2 (Template) ──────┘ │ + ▼ + Phase 5 (README Gen) + │ + ▼ + Phase 6 (Commit & Gates) +``` + +Phases 1 and 2 are independent (can be parallelized). +Phase 3 tasks (3.1–3.6) are independent (can be parallelized). +Phase 4 depends on Phase 3.1 (the /spec skill must be migrated before adding reconcile mode). +Phase 5 depends on Phase 2 (template must be updated so the script knows what to scrape) and Phase 4 (the /spec regeneration instruction must be in place). diff --git a/scripts/generate-spec-readme.py b/scripts/generate-spec-readme.py new file mode 100644 index 000000000..e31bc27c5 --- /dev/null +++ b/scripts/generate-spec-readme.py @@ -0,0 +1,151 @@ +#!/usr/bin/env python3 +""" +Generate specs/README.md from spec frontmatter. 
+ +Scrapes H1, Status, Surfaces, Code, and first Overview sentence from each +spec file and produces a sorted markdown table. The README is never +hand-edited — this script is the single source of truth. + +Usage: python scripts/generate-spec-readme.py +""" +import os +import re +import sys + + +def extract_frontmatter(filepath): + """Extract metadata from a spec file's frontmatter and overview.""" + try: + with open(filepath, "r", encoding="utf-8") as f: + content = f.read() + except (OSError, UnicodeDecodeError) as e: + print(f"Warning: cannot read {filepath}: {e}", file=sys.stderr) + return None + + lines = content.split("\n") + + # H1 title + title = None + for line in lines: + if line.startswith("# "): + title = line[2:].strip() + break + + if not title: + return None + + # Status + status = "—" + for line in lines: + m = re.match(r"\*\*Status:\*\*\s*(.*)", line) + if m: + status = m.group(1).strip() + break + + # Surfaces + surfaces = "—" + for line in lines: + m = re.match(r"\*\*Surfaces:\*\*\s*(.*)", line) + if m: + surfaces = m.group(1).strip() + break + + # Code paths + code = "—" + for line in lines: + m = re.match(r"\*\*Code:\*\*\s*(.*)", line) + if m: + raw = m.group(1).strip() + # Extract path prefixes from markdown links like [path/](../path/) + paths = re.findall(r"\[([^\]]+)\]\([^)]+\)", raw) + if paths: + code = ", ".join(f"`{p}`" for p in paths) + elif raw.lower() not in ("none", ""): + code = raw + break + + # First sentence of Overview section + purpose = "—" + in_overview = False + for line in lines: + if line.strip() == "## Overview": + in_overview = True + continue + if in_overview and line.strip() and not line.startswith("#"): + # Take entire first line — more robust than sentence splitting + purpose = line.strip() + break + + # Escape pipe characters to prevent breaking markdown table + status = status.replace("|", "\\|") + surfaces = surfaces.replace("|", "\\|") + code = code.replace("|", "\\|") + purpose = purpose.replace("|", "\\|") + + 
return {
+        "title": title,
+        "status": status,
+        "surfaces": surfaces,
+        "code": code,
+        "purpose": purpose,
+    }
+
+
+def main():
+    # Determine specs directory
+    script_dir = os.path.dirname(os.path.abspath(__file__))
+    repo_root = os.path.dirname(script_dir)
+    specs_dir = os.path.join(repo_root, "specs")
+
+    if not os.path.isdir(specs_dir):
+        print(f"Error: specs directory not found at {specs_dir}", file=sys.stderr)
+        sys.exit(1)
+
+    # Exclusions
+    exclude = {"CONSTITUTION.md", "SPEC-TEMPLATE.md", "README.md"}
+
+    # Collect spec metadata
+    specs = []
+    for filename in sorted(os.listdir(specs_dir)):
+        if not filename.endswith(".md") or filename in exclude:
+            continue
+        filepath = os.path.join(specs_dir, filename)
+        meta = extract_frontmatter(filepath)
+        if meta:
+            meta["filename"] = filename
+            specs.append(meta)
+
+    # Sort alphabetically by filename
+    specs.sort(key=lambda s: s["filename"])
+
+    # Generate README
+    lines = [
+        "# Specifications",
+        "",
+        "> **Constitution:** All specs must comply with [CONSTITUTION.md](./CONSTITUTION.md).",
+        "> See Spec Laws (SL1–SL5) for naming, lifecycle, and granularity rules.",
+        "",
+        "<!-- AUTO-GENERATED by scripts/generate-spec-readme.py; do not edit by hand. -->",
+        "| Spec | Status | Surfaces | Code | Purpose |",
+        "|------|--------|----------|------|---------|",
+    ]
+
+    for spec in specs:
+        name = spec["filename"]
+        link = f"[{name}](./{name})"
+        lines.append(
+            f"| {link} | {spec['status']} | {spec['surfaces']} | {spec['code']} | {spec['purpose']} |"
+        )
+
+    lines.append("")
+
+    readme_path = os.path.join(specs_dir, "README.md")
+    with open(readme_path, "w", encoding="utf-8", newline="\n") as f:
+        f.write("\n".join(lines))
+
+    print(f"Generated {readme_path} with {len(specs)} specs.", file=sys.stderr)
+
+
+if __name__ == "__main__":
+    main()
diff --git a/scripts/workflow-state.py b/scripts/workflow-state.py
new file mode 100644
index 000000000..20f1d8bb3
--- /dev/null
+++ b/scripts/workflow-state.py
@@ -0,0 +1,142 @@
+#!/usr/bin/env python3
+"""
+Canonical workflow state utility — the ONLY way
to write .workflow/state.json. +Skills and commands call this script instead of writing JSON by hand. + +Usage: + python scripts/workflow-state.py set gates.passed now + python scripts/workflow-state.py set gates.commit_ref abc123def + python scripts/workflow-state.py set review.findings 3 + python scripts/workflow-state.py set pr.gemini_triaged true + python scripts/workflow-state.py set-null gates.passed + python scripts/workflow-state.py init feature/my-branch + python scripts/workflow-state.py show + python scripts/workflow-state.py delete + +Magic values for 'set': + now → current UTC ISO 8601 timestamp + true → boolean true + false → boolean false + digits → integer +""" +import json +import os +import sys +from datetime import datetime, timezone + + +def get_state_path(): + project_dir = os.environ.get("CLAUDE_PROJECT_DIR", os.getcwd()) + state_dir = os.path.join(project_dir, ".workflow") + os.makedirs(state_dir, exist_ok=True) + return os.path.join(state_dir, "state.json") + + +def read_state(): + path = get_state_path() + if not os.path.exists(path): + return {} + try: + with open(path, "r") as f: + return json.load(f) + except (json.JSONDecodeError, OSError): + return {} + + +def write_state(state): + path = get_state_path() + with open(path, "w") as f: + json.dump(state, f, indent=2) + f.write("\n") + print(f"Updated {path}", file=sys.stderr) + + +def coerce_value(raw): + """Convert string value to appropriate Python type.""" + if raw == "now": + return datetime.now(timezone.utc).isoformat() + if raw == "true": + return True + if raw == "false": + return False + try: + return int(raw) + except ValueError: + return raw + + +def set_nested(state, key, value): + """Set a dotted key path (e.g., 'gates.passed') in the state dict.""" + parts = key.split(".") + current = state + for part in parts[:-1]: + if part not in current or not isinstance(current[part], dict): + current[part] = {} + current = current[part] + current[parts[-1]] = value + + +def main(): + 
if len(sys.argv) < 2:
+        print(
+            "Usage: workflow-state.py <command> [args...]\n"
+            "Commands: set <key> <value>, set-null <key>, init <branch>, show, delete",
+            file=sys.stderr,
+        )
+        sys.exit(1)
+
+    command = sys.argv[1]
+
+    if command == "show":
+        state = read_state()
+        print(json.dumps(state, indent=2))
+        sys.exit(0)
+
+    if command == "delete":
+        path = get_state_path()
+        if os.path.exists(path):
+            os.remove(path)
+            print(f"Deleted {path}", file=sys.stderr)
+        else:
+            print("No workflow state to clear.", file=sys.stderr)
+        sys.exit(0)
+
+    if command == "init":
+        if len(sys.argv) < 3:
+            print("Usage: workflow-state.py init <branch>", file=sys.stderr)
+            sys.exit(1)
+        branch = sys.argv[2]
+        state = {
+            "branch": branch,
+            "started": datetime.now(timezone.utc).isoformat(),
+        }
+        write_state(state)
+        sys.exit(0)
+
+    if command == "set":
+        if len(sys.argv) < 4:
+            print("Usage: workflow-state.py set <key> <value>", file=sys.stderr)
+            sys.exit(1)
+        key = sys.argv[2]
+        value = coerce_value(sys.argv[3])
+        state = read_state()
+        set_nested(state, key, value)
+        write_state(state)
+        sys.exit(0)
+
+    if command == "set-null":
+        if len(sys.argv) < 3:
+            print("Usage: workflow-state.py set-null <key>", file=sys.stderr)
+            sys.exit(1)
+        key = sys.argv[2]
+        state = read_state()
+        set_nested(state, key, None)
+        write_state(state)
+        sys.exit(0)
+
+    print(f"Unknown command: {command}", file=sys.stderr)
+    sys.exit(1)
+
+
+if __name__ == "__main__":
+    main()
diff --git a/specs/CONSTITUTION.md b/specs/CONSTITUTION.md
index 1a3021db3..a38c18329 100644
--- a/specs/CONSTITUTION.md
+++ b/specs/CONSTITUTION.md
@@ -57,3 +57,15 @@ Non-negotiable principles. Every spec, plan, implementation, and review MUST com
**R2.** `CancellationToken` must be threaded through the entire async call chain — never accepted as a parameter and then ignored. If a method accepts a token, it must pass it to every async call it makes.
**R3.** Event handlers and subscriptions must be cleaned up in `Dispose`. Every `+=` needs a corresponding `-=`.
Every `.subscribe()` needs an `.unsubscribe()` or disposal mechanism. + +## Spec Laws + +**SL1.** One spec per domain concept, named after the thing — not the project, surface, or enhancement. `plugin-traces.md` not `tui-plugin-traces.md`. `data-explorer.md` not `vscode-data-explorer-monaco-editor.md`. Cross-cutting architectural patterns are legitimate standalone specs. + +**SL2.** Specs are living documents — updated as the feature evolves. Plans (`docs/plans/`) are ephemeral and consumed by implementation. Project coordination documents (parity, polish, audit) are plans, not specs. + +**SL3.** Surface-specific behavior (TUI screen layout, Extension panel wiring, MCP tool schema) lives in surface sections within the domain spec, not in separate spec files. + +**SL4.** Spec frontmatter `**Code:**` must contain grep-friendly path prefixes, not prose like "Multiple (see spec)". Every spec with an implementation must have at least one code path. System-wide specs (architecture, governance) use `**Code:** System-wide`. + +**SL5.** Specs for removed features are deleted. No archival, no deprecation ceremony, no tombstones. diff --git a/specs/README.md b/specs/README.md index 13d88f852..96bb567d8 100644 --- a/specs/README.md +++ b/specs/README.md @@ -1,41 +1,37 @@ # Specifications -> **Constitution:** All specs must comply with [CONSTITUTION.md](./CONSTITUTION.md). All specs must have numbered acceptance criteria (AC-01, AC-02, ...) before implementation. - -Generated by spec-gen. 
-
-## Core
-
-| Spec | Code | Purpose |
-|------|------|---------|
-| [architecture.md](./architecture.md) | [src/](../src/) | System architecture and cross-cutting patterns |
-| [authentication.md](./authentication.md) | [src/PPDS.Auth/](../src/PPDS.Auth/) | Authentication, credential management, and discovery |
-| [connection-pooling.md](./connection-pooling.md) | [src/PPDS.Dataverse/Pooling/](../src/PPDS.Dataverse/Pooling/) | Dataverse connection pooling and resilience |
-
-## Features
-
-| Spec | Code | Purpose |
-|------|------|---------|
-| [analyzers.md](./analyzers.md) | [src/PPDS.Analyzers/](../src/PPDS.Analyzers/) | Roslyn analyzers for code quality |
-| [bulk-operations.md](./bulk-operations.md) | [src/PPDS.Dataverse/BulkOperations/](../src/PPDS.Dataverse/BulkOperations/) | High-performance bulk data operations |
-| [cli.md](./cli.md) | [src/PPDS.Cli/Commands/](../src/PPDS.Cli/Commands/) | Command Line Interface structure and commands |
-| [dataverse-services.md](./dataverse-services.md) | [src/PPDS.Dataverse/Services/](../src/PPDS.Dataverse/Services/) | High-level Dataverse domain services |
-| [mcp.md](./mcp.md) | [src/PPDS.Mcp/](../src/PPDS.Mcp/) | Model Context Protocol server implementation |
-| [migration.md](./migration.md) | [src/PPDS.Migration/](../src/PPDS.Migration/) | Data migration engine (CMT format) |
-| [plugins.md](./plugins.md) | [src/PPDS.Plugins/](../src/PPDS.Plugins/) | Plugin extraction and registration |
-| [query.md](./query.md) | [src/PPDS.Dataverse/Query/](../src/PPDS.Dataverse/Query/) | SQL-to-FetchXml transpilation and execution |
-| [tui.md](./tui.md) | [src/PPDS.Cli/Tui/](../src/PPDS.Cli/Tui/) | Terminal User Interface |
-| [per-panel-environment-scoping.md](./per-panel-environment-scoping.md) | [src/PPDS.Extension/src/](../src/PPDS.Extension/src/), [src/PPDS.Cli/](../src/PPDS.Cli/) | Per-panel environment targeting for VS Code extension |
-| [panel-parity.md](./panel-parity.md) | Multiple (see spec) | Feature parity — 6 panels across RPC, VS Code, TUI, MCP |
-| [query-parity.md](./query-parity.md) | [src/PPDS.Cli/Services/Query/](../src/PPDS.Cli/Services/Query/), [src/PPDS.Extension/src/](../src/PPDS.Extension/src/) | Align daemon query path to shared SqlQueryService, hints, cross-env, colors |
-| [update-check.md](./update-check.md) | [src/PPDS.Cli/Services/UpdateCheck/](../src/PPDS.Cli/Services/UpdateCheck/) | CLI update check, notification, and self-update |
-
-## Process
-
-| Spec | Code | Purpose |
-|------|------|---------|
-| [workflow-enforcement.md](./workflow-enforcement.md) | [.claude/](.claude/) | Mechanical workflow enforcement — hooks, state tracking, skill updates |
-
-## Status
-
-See individual spec files for implementation progress.
+> **Constitution:** All specs must comply with [CONSTITUTION.md](./CONSTITUTION.md).
+> See Spec Laws (SL1–SL5) for naming, lifecycle, and granularity rules.
+
+
+
+| Spec | Status | Surfaces | Code | Purpose |
+|------|--------|----------|------|---------|
+| [analyzers.md](./analyzers.md) | Partial (3 of 13 rules implemented) | N/A | `src/PPDS.Analyzers/` | PPDS.Analyzers is a Roslyn-based static code analysis package that enforces architectural patterns and best practices at compile time. Analyzers run during compilation and surface warnings in the IDE, catching common mistakes before code is committed. |
+| [architecture.md](./architecture.md) | Implemented | — | `src/` | PPDS is a TUI-first multi-interface platform for Power Platform development. All business logic resides in Application Services, enabling CLI, TUI, RPC, and MCP interfaces to share identical behavior through a single code path. |
+| [authentication.md](./authentication.md) | Implemented | All | `src/PPDS.Auth/` | The authentication system provides secure credential management, multi-method authentication, and environment discovery for Power Platform development. It supports nine authentication methods spanning interactive user flows, service principals, managed identities, and federated workload identities, with platform-native secure storage for secrets. |
+| [bulk-operations.md](./bulk-operations.md) | Implemented | CLI, TUI, MCP | `src/PPDS.Dataverse/BulkOperations/` | The bulk operations system provides high-throughput data manipulation using Dataverse's native bulk APIs (`CreateMultiple`, `UpdateMultiple`, `UpsertMultiple`, `DeleteMultiple`). It delivers up to 5x better performance than `ExecuteMultiple` by leveraging modern bulk endpoints with automatic batching, parallel execution, throttle handling, and diagnostic analysis for batch failures. |
+| [cli.md](./cli.md) | Implemented | CLI | `src/PPDS.Cli/Commands/` | The PPDS CLI provides a command-line interface for Power Platform development operations. Built on System.CommandLine, it offers 18 command groups for managing authentication, querying Dataverse, deploying plugins, and automating DevOps workflows. All commands delegate to Application Services for business logic, ensuring consistency with TUI and RPC interfaces. |
+| [connection-pooling.md](./connection-pooling.md) | Implemented | All | `src/PPDS.Dataverse/Pooling/`, `src/PPDS.Dataverse/Resilience/` | The connection pooling system manages Dataverse client lifecycle with intelligent load distribution and throttle awareness. It maintains a pool of pre-authenticated connections across multiple Application Users, automatically routing requests away from throttled connections and handling service protection errors transparently. |
+| [connection-references.md](./connection-references.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Services/IConnectionReferenceService.cs`, `src/PPDS.Extension/src/panels/ConnectionReferencesPanel.ts`, `src/PPDS.Cli/Tui/Screens/ConnectionReferencesScreen.cs` | View connection references in an environment, see which Power Platform connections they're bound to (with health status via Connections API), understand flow dependencies, and detect orphaned references. Key tool for connection health monitoring and deployment troubleshooting. |
+| [data-explorer.md](./data-explorer.md) | Draft | Extension, TUI, MCP | `src/PPDS.Extension/src/panels/QueryPanel.ts` | The Data Explorer provides interactive query execution and result browsing across all PPDS surfaces (VS Code webview, TUI, MCP). This spec covers the VS Code panel enhancements including Monaco editor integration and rectangular selection/copy UX. |
+| [dataverse-services.md](./dataverse-services.md) | Implemented | All | `src/PPDS.Dataverse/Services/`, `src/PPDS.Dataverse/Metadata/` | The Dataverse Services layer provides domain-specific operations for Dataverse entities through a consistent interface pattern. These services abstract SDK complexity, provide type-safe DTOs, and integrate with the connection pool for efficient resource usage. |
+| [environment-dashboard.md](./environment-dashboard.md) | Draft | TUI | — | The Environment Dashboard provides an at-a-glance view of environment administration state, combining five Dataverse services into a single unified interface. Users navigate between sub-views for Users & Roles, Cloud Flows, Environment Variables, and Connection References. Each sub-view exposes a filterable table with a detail panel showing properties of the selected item. |
+| [environment-variables.md](./environment-variables.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Services/IEnvironmentVariableService.cs`, `src/PPDS.Extension/src/panels/EnvironmentVariablesPanel.ts`, `src/PPDS.Cli/Tui/Screens/EnvironmentVariablesScreen.cs` | View environment variable definitions and their current values, update values with type-aware validation, and export deployment settings. Key tool for deployment troubleshooting — surfaces the gap between default and current values, highlights missing required variables, and enables AI agents to fix misconfigurations. |
+| [ext-verify-tool.md](./ext-verify-tool.md) | Draft | Extension | `src/PPDS.Extension/tools/` | A CLI tool that launches and controls VS Code via Playwright's Electron integration, providing AI agents with full visual and interactive access to extension webview panels. Enables screenshots, DOM interaction, command palette execution, keyboard shortcuts, console log capture, and output channel reading — closing the feedback loop between implementation and visual verification. |
+| [import-jobs.md](./import-jobs.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Services/IImportJobService.cs`, `src/PPDS.Extension/src/panels/ImportJobsPanel.ts`, `src/PPDS.Cli/Tui/Screens/ImportJobsScreen.cs` | View solution import history for an environment. Shows import status, progress, duration, and allows drilling into the XML import log for troubleshooting failed imports. |
+| [mcp.md](./mcp.md) | Implemented | MCP | `src/PPDS.Mcp/` | The MCP (Model Context Protocol) server exposes Power Platform/Dataverse capabilities to AI assistants like Claude Code. It provides 13 tools for authentication, environment management, schema discovery, query execution, and plugin debugging—enabling natural language interaction with Dataverse through the standardized MCP protocol over stdio transport. |
+| [metadata-browser.md](./metadata-browser.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Services/IMetadataService.cs`, `src/PPDS.Extension/src/panels/MetadataBrowserPanel.ts`, `src/PPDS.Cli/Tui/Screens/MetadataExplorerScreen.cs` | Browse entity definitions, attributes, relationships, keys, and privileges. The schema exploration tool for understanding the Dataverse data model without leaving the IDE or TUI. |
+| [migration.md](./migration.md) | Implemented | CLI, TUI | `src/PPDS.Migration/`, `src/PPDS.Tui/Screens/MigrationScreen.cs` | The migration system provides high-performance data export and import between Dataverse environments. It uses dependency analysis to determine correct import ordering, handles circular references through deferred field processing, and supports the Configuration Migration Tool (CMT) format for interoperability with Microsoft tooling. |
+| [per-panel-environment-scoping.md](./per-panel-environment-scoping.md) | Draft | — | `src/PPDS.Extension/src/`, `src/PPDS.Cli/Commands/Serve/Handlers/`, `src/PPDS.Cli/Tui/` | Per-panel environment scoping replaces the global active-environment model with panel-level environment targeting. Each Data Explorer and Solutions panel has its own environment picker in the webview header, enabling side-by-side comparison of environments. The daemon RPC methods gain an optional `environmentUrl` parameter so panels can query any environment without changing the profile's saved environment. A related TUI fix removes the broken `CloseAllTabs` behavior on profile switch. |
+| [plugin-traces.md](./plugin-traces.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Services/IPluginTraceService.cs`, `src/PPDS.Cli/Commands/PluginTraces/`, `src/PPDS.Extension/src/panels/PluginTracesPanel.ts`, `src/PPDS.Cli/Tui/Screens/PluginTracesScreen.cs` | The plugin traces system provides querying, inspection, and management of Dataverse plugin trace logs. It supports filtered listing, detailed trace inspection, execution timeline visualization with depth-based hierarchy, trace log settings management, and bulk deletion with progress reporting. Available across all four PPDS surfaces — CLI, TUI, VS Code extension, and MCP. |
+| [plugins.md](./plugins.md) | Implemented | CLI, TUI | `src/PPDS.Plugins/`, `src/PPDS.Cli/Plugins/` | The plugin system enables code-first registration of Dataverse plugins using declarative attributes. Developers annotate plugin classes with `PluginStepAttribute` and `PluginImageAttribute`, then use CLI commands to extract metadata and deploy registrations to Dataverse environments. |
+| [query.md](./query.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Query/`, `src/PPDS.Dataverse/Sql/`, `src/PPDS.Cli/Services/Query/`, `src/PPDS.Cli/Services/History/`, `src/PPDS.Cli/Commands/Serve/Handlers/`, `src/PPDS.Extension/src/` | The query system enables SQL-like queries against Dataverse through automatic transpilation to FetchXML. It provides a full query pipeline: parsing SQL into an AST, transpiling to FetchXML with virtual column support, executing against Dataverse via the connection pool, and expanding results with formatted values. Query history tracks executed queries per environment for recall and re-execution. |
+| [setup-command.md](./setup-command.md) | Draft | — | `.claude/commands/setup.md` | The `/setup` command configures a complete PPDS development environment from scratch or refreshes an existing one. It serves three audiences: developers on new machines, new contributors joining the project, and AI agents in fresh Claude Code sessions. Supports Windows and Linux (dev containers). |
+| [solutions.md](./solutions.md) | Draft | CLI, TUI, Extension | `src/PPDS.Dataverse/Services/SolutionService.cs`, `src/PPDS.Extension/src/panels/SolutionsPanel.ts` | Solutions management across all PPDS surfaces — browsing, component inspection, export, import monitoring. The Solutions domain covers listing Dataverse solutions, resolving component types and names to human-readable labels, exporting solutions, and tracking import job progress. Surface-specific behavior (TUI screen layout, Extension webview panel, CLI commands) is defined in dedicated sections below. |
+| [spec-governance.md](./spec-governance.md) | Draft | All | System-wide | The PPDS specs directory has grown to 32 files with no governing philosophy. Feature specs (living — "what Plugin Traces should be") and project specs (ephemeral — "build 6 panels for parity") are mixed together. Surface-specific duplicates exist (`plugin-traces.md` AND `tui-plugin-traces.md`). Enhancement specs exist for implementation details (`vscode-data-explorer-monaco-editor.md`). The README index lists only half the specs on disk. No guidance exists for naming, lifecycle, granularity, or what happens after a spec reaches "Implemented." |
+| [tui-verify-tool.md](./tui-verify-tool.md) | Draft | — | `tests/PPDS.Tui.E2eTests/tools/` | A CLI tool that lets AI agents launch the PPDS TUI in a PTY, send keystrokes, read terminal text, and wait for content — closing the feedback loop between TUI implementation and verification. Mirrors the `webview-cdp.mjs` pattern but targets Terminal.Gui applications instead of VS Code webviews. |
+| [tui.md](./tui.md) | Implemented | TUI | `src/PPDS.Cli/Tui/` | The PPDS TUI provides an interactive terminal interface for Power Platform development. Built on Terminal.Gui 1.19+, it offers a visual environment for SQL queries, profile management, and environment switching. The TUI shares Application Services with CLI and RPC, ensuring consistent behavior across all interfaces. |
+| [update-check.md](./update-check.md) | Implemented (partial — self-update not yet built) | CLI | `src/PPDS.Cli/Services/UpdateCheck/`, `src/PPDS.Cli/Commands/VersionCommand.cs`, `src/PPDS.Cli/Infrastructure/StartupUpdateNotifier.cs` | The update check system keeps PPDS CLI users informed about new versions and provides a mechanism to update in place. It queries the NuGet flat-container API, caches results locally for 24 hours, and surfaces notifications at startup and on-demand via `ppds version`. Users on pre-release tracks see both stable and pre-release update options. |
+| [web-resources.md](./web-resources.md) | Implemented | CLI, TUI, Extension, MCP | `src/PPDS.Dataverse/Services/IWebResourceService.cs`, `src/PPDS.Extension/src/panels/WebResourcesPanel.ts`, `src/PPDS.Cli/Tui/Screens/WebResourcesScreen.cs` | Browse, view, edit, and publish web resources. Features a FileSystemProvider for in-editor editing with auto-publish notification, conflict detection, and unpublished change detection. The most complex panel at the VS Code layer due to the full save-conflict-diff-resolve-publish workflow. |
+| [workflow-enforcement.md](./workflow-enforcement.md) | Draft | — | `.claude/`, `scripts/hooks/` | A mechanical enforcement system that ensures AI agents follow the PPDS development workflow — from design through PR creation — without human micromanagement. Skills define the required steps. Hooks enforce that steps actually happened before allowing commits and PRs. A workflow state file tracks progress. |
diff --git a/specs/SPEC-TEMPLATE.md b/specs/SPEC-TEMPLATE.md
index b2a0633d3..60d9c01bc 100644
--- a/specs/SPEC-TEMPLATE.md
+++ b/specs/SPEC-TEMPLATE.md
@@ -1,9 +1,17 @@
 # {System Name}
 
 **Status:** Draft | Implemented
-**Version:** 1.0
 **Last Updated:** YYYY-MM-DD
-**Code:** [path/](../path/) | None
+**Code:** [path/](../path/) | None | System-wide
+**Surfaces:** CLI | TUI | Extension | MCP | All | N/A
+
+{**Code** must be grep-friendly path prefixes — not prose like "Multiple (see spec)".}
+{Every spec with an implementation must have at least one code path. See Constitution SL4.}
+{Examples:}
+{- Good: `[src/PPDS.Dataverse/Services/](../src/PPDS.Dataverse/Services/)`}
+{- Good: `[src/PPDS.Cli/Commands/PluginTraces/](../src/PPDS.Cli/Commands/PluginTraces/)`}
+{- Bad: `Multiple — see per-panel sections`}
+{- Bad: `None` (when code exists)}
 
 ---
@@ -68,6 +76,26 @@
 2. **{Step}**: {Description}
 3. **{Step}**: {Description}
 
+### Surface-Specific Behavior
+
+{Include subsections only for surfaces with meaningful behavior beyond the Application Service.}
+
+#### CLI Surface
+
+{CLI-specific commands, flags, output format.}
+
+#### TUI Surface
+
+{Screen layout, keyboard navigation, filter dialogs.}
+
+#### Extension Surface
+
+{Panel wiring, webview behavior, VS Code integration.}
+
+#### MCP Surface
+
+{Tool schema, response format.}
+
 ### Constraints
 
 - {Constraint the implementation must follow}
@@ -253,6 +281,14 @@ public class New{Thing} : I{Thing}
 
 ---
 
+## Changelog
+
+| Date | Change |
+|------|--------|
+| YYYY-MM-DD | Initial spec |
+
+---
+
 ## Roadmap
 
 {Optional section for future enhancements.}
diff --git a/specs/analyzers.md b/specs/analyzers.md
index f99725b3e..e28ab739f 100644
--- a/specs/analyzers.md
+++ b/specs/analyzers.md
@@ -1,9 +1,9 @@
 # Roslyn Analyzers
 
 **Status:** Partial (3 of 13 rules implemented)
-**Version:** 1.0
 **Last Updated:** 2026-01-27
 **Code:** [src/PPDS.Analyzers/](../src/PPDS.Analyzers/)
+**Surfaces:** N/A
 
 ---
@@ -452,6 +452,12 @@ public async Task PPDS012_TaskResult_ReportsWarning()
 
 ---
 
+## Changelog
+
+| Date | Change |
+|------|--------|
+| 2026-03-18 | Added Surfaces frontmatter, Changelog per spec governance |
+
 ## Roadmap
 
 - Code fix providers for auto-remediation
diff --git a/specs/authentication.md b/specs/authentication.md
index e31f61f1a..f64f0dcd3 100644
--- a/specs/authentication.md
+++ b/specs/authentication.md
@@ -1,9 +1,9 @@
 # Authentication
 
 **Status:** Implemented
-**Version:** 1.0
 **Last Updated:** 2026-01-23
 **Code:** [src/PPDS.Auth/](../src/PPDS.Auth/)
+**Surfaces:** All
 
 ---
@@ -568,6 +568,12 @@ public void CloudEndpoints_ReturnsCorrectAuthority_ForEachCloud()
 
 ---
 
+## Changelog
+
+| Date | Change |
+|------|--------|
+| 2026-03-18 | Added Surfaces frontmatter, Changelog per spec governance |
+
 ## Roadmap
 
 - Certificate auto-renewal detection and prompting
diff --git a/specs/bulk-operations.md b/specs/bulk-operations.md
index c99eeab18..31c51c682 100644
--- a/specs/bulk-operations.md
+++ b/specs/bulk-operations.md
@@ -1,9 +1,9 @@
 # Bulk Operations
 
 **Status:** Implemented
-**Version:** 1.0
 **Last Updated:** 2026-01-23
 **Code:** [src/PPDS.Dataverse/BulkOperations/](../src/PPDS.Dataverse/BulkOperations/)
+**Surfaces:** CLI, TUI, MCP
 
 ---
@@ -530,6 +530,12 @@ public async Task Progress_ReportsAfterEachBatch()
 
 ---
 
+## Changelog
+
+| Date | Change |
+|------|--------|
+| 2026-03-18 | Added Surfaces frontmatter, Changelog per spec governance |
+
 ## Roadmap
 
 - Automatic elastic table detection (currently caller must specify)
diff --git a/specs/cli.md b/specs/cli.md
index 557221ef7..8f4ac75fc 100644
--- a/specs/cli.md
+++ b/specs/cli.md
@@ -1,9 +1,9 @@
 # CLI
 
 **Status:** Implemented
-**Version:** 1.0
 **Last Updated:** 2026-01-23
 **Code:** [src/PPDS.Cli/Commands/](../src/PPDS.Cli/Commands/)
+**Surfaces:** CLI
 
 ---
@@ -515,6 +515,12 @@ public async Task ExitCode_IsAuthError_WhenProfileNotFound()
 
 ---
 
+## Changelog
+
+| Date | Change |
+|------|--------|
+| 2026-03-18 | Added Surfaces frontmatter, Changelog per spec governance |
+
 ## Roadmap
 
 - Tab completion support for common shells (bash, zsh, PowerShell)
diff --git a/specs/connection-pooling.md b/specs/connection-pooling.md
index 9395c1230..66ef7f0ab 100644
--- a/specs/connection-pooling.md
+++ b/specs/connection-pooling.md
@@ -1,9 +1,9 @@
 # Connection Pooling
 
 **Status:** Implemented
-**Version:** 1.0
 **Last Updated:** 2026-01-23
 **Code:** [src/PPDS.Dataverse/Pooling/](../src/PPDS.Dataverse/Pooling/), [src/PPDS.Dataverse/Resilience/](../src/PPDS.Dataverse/Resilience/)
+**Surfaces:** All
 
 ---
@@ -514,6 +514,12 @@ public async Task ThrottleAwareStrategy_SkipsThrottledConnection()
 
 ---
 
+## Changelog
+
+| Date | Change |
+|------|--------|
+| 2026-03-18 | Added Surfaces frontmatter, Changelog per spec governance |
+
 ## Roadmap
 
 - Adaptive pool sizing based on server load signals
diff --git a/specs/connection-references.md b/specs/connection-references.md
new file mode 100644
index 000000000..ef14aa5f0
--- /dev/null
+++ b/specs/connection-references.md
@@ -0,0 +1,169 @@
+# Connection References
+
+**Status:** Implemented
+**Last Updated:** 2026-03-18
+**Code:** [src/PPDS.Dataverse/Services/IConnectionReferenceService.cs](../src/PPDS.Dataverse/Services/IConnectionReferenceService.cs) | [src/PPDS.Extension/src/panels/ConnectionReferencesPanel.ts](../src/PPDS.Extension/src/panels/ConnectionReferencesPanel.ts) | [src/PPDS.Cli/Tui/Screens/ConnectionReferencesScreen.cs](../src/PPDS.Cli/Tui/Screens/ConnectionReferencesScreen.cs)
+**Surfaces:** CLI, TUI, Extension, MCP
+
+---
+
+## Overview
+
+View connection references in an environment, see which Power Platform connections they're bound to (with health status via Connections API), understand flow dependencies, and detect orphaned references. Key tool for connection health monitoring and deployment troubleshooting.
+
+### Goals
+
+- **Connection health visibility:** Surface connection status from Power Platform Connections API alongside Dataverse connection reference data
+- **Dependency analysis:** Show which cloud flows depend on each connection reference
+- **Orphan detection:** Identify connection references with no bound connection or flows with missing references
+- **Multi-surface consistency:** Same data and operations via VS Code, TUI, MCP, and CLI (Constitution A1, A2)
+
+### Non-Goals
+
+- Connection Picker / rebinding (deferred — tracked as separate issue)
+- Connection creation or management (Power Platform admin operations)
+- Connector registration or custom connector management
+
+---
+
+## Architecture
+
+```
+┌──────────────────────────────────────────────────────┐
+│                 UI Surfaces (thin)                   │
+│ ┌───────────┐ ┌─────────┐ ┌──────┐ ┌─────────┐      │
+│ │  VS Code  │ │   TUI   │ │ MCP  │ │   CLI   │      │
+│ │  Webview  │ │  Screen │ │ Tool │ │ Command │      │
+│ └─────┬─────┘ └────┬────┘ └──┬───┘ └────┬────┘      │
+│   JSON-RPC      Direct    Direct      Direct        │
+│ ┌─────▼──────────────▼──────────▼────────────▼────┐ │
+│ │           IConnectionReferenceService           │ │
+│ │   ListAsync, GetAsync, GetFlowsUsingAsync,      │ │
+│ │   AnalyzeAsync                                  │ │
+│ ├─────────────────────────────────────────────────┤ │
+│ │   IConnectionService (Power Platform API)       │ │
+│ │   IFlowService                                  │ │
+│ └─────────────────────┬───────────────────────────┘ │
+│                       │                             │
+│ ┌─────────────────────▼───────────────────────────┐ │
+│ │ IDataverseConnectionPool + IPowerPlatformToken  │ │
+│ └─────────────────────────────────────────────────┘ │
+└──────────────────────────────────────────────────────┘
+```
+
+### Components
+
+| Component | Responsibility |
+|-----------|----------------|
+| `IConnectionReferenceService` | Domain service — ListAsync, GetAsync, GetFlowsUsingAsync, AnalyzeAsync |
+| `IConnectionService` | Power Platform API integration — connection status via `service.powerapps.com` |
+| `IFlowService` | Flow dependency resolution — ListAsync |
+| `ConnectionReferencesPanel.ts` | VS Code webview panel — table, detail pane, solution filter, orphan analysis |
+| `ConnectionReferencesScreen.cs` | TUI screen — data table, detail/analyze dialogs, hotkeys |
+| `ConnectionReferencesListTool.cs` | MCP tool — structured listing with connection status |
+| `ConnectionReferencesGetTool.cs` | MCP tool — full detail with flows and connection info |
+| `ConnectionReferencesAnalyzeTool.cs` | MCP tool — orphan detection |
+
+### Dependencies
+
+- Depends on: [connection-pooling.md](./connection-pooling.md) for pooled Dataverse clients
+- Depends on: [architecture.md](./architecture.md) for Application Service boundary
+- Uses patterns from: [CONSTITUTION.md](./CONSTITUTION.md) — A1, A2, D1, D4
+
+---
+
+## Specification
+
+### Service Layer
+
+**RPC Endpoints:**
+
+| Method | Request | Response |
+|--------|---------|----------|
+| `connectionReferences/list` | `{ solutionId?, environmentUrl? }` | `{ references: ConnectionReferenceInfo[] }` |
+| `connectionReferences/get` | `{ logicalName, environmentUrl? }` | `{ reference: ConnectionReferenceDetail, flows: FlowInfo[], connection?: ConnectionInfo }` |
+| `connectionReferences/analyze` | `{ environmentUrl? }` | `{ orphanedReferences: [], orphanedFlows: [] }` |
+| `connections/list` | `{ connectorId?, environmentUrl? }` | `{ connections: ConnectionInfo[] }` |
+
+**ConnectionReferenceInfo fields:** logicalName, displayName, connectorId, connectionId, isManaged, modifiedOn, connectionStatus (Connected/Error/Unknown), connectorDisplayName
+
+**Note:** `connections/list` uses Power Platform API (`service.powerapps.com`), not Dataverse SDK. Requires `IPowerPlatformTokenProvider`. SPN auth has limited access — graceful degradation: panel loads without connection status, shows "N/A" instead.
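The graceful-degradation rule in the note above can be made concrete: if the Connections API call fails, the reference list must still render with an "N/A" status rather than failing the whole load. A minimal sketch in illustrative TypeScript — `loadRows`, its two callback parameters, and a status map keyed by logical name are assumptions for the example, not the panel's actual API:

```typescript
// Hypothetical sketch of graceful degradation for connection status.
// The reference list comes from Dataverse; the status lookup comes from
// the Power Platform Connections API and may fail under SPN auth.
interface ConnectionReferenceRow {
  logicalName: string;
  connectionStatus: string; // "Connected" | "Error" | "Unknown" | "N/A"
}

async function loadRows(
  listReferences: () => Promise<{ logicalName: string }[]>,
  listConnectionStatuses: () => Promise<Map<string, string>>,
): Promise<ConnectionReferenceRow[]> {
  const refs = await listReferences(); // must succeed — core data
  let statuses: Map<string, string> | null = null;
  try {
    statuses = await listConnectionStatuses(); // enrichment — may fail
  } catch {
    statuses = null; // limited auth: degrade, do not fail the panel
  }
  return refs.map((r) => ({
    logicalName: r.logicalName,
    connectionStatus: statuses?.get(r.logicalName) ?? "N/A",
  }));
}
```

The key design point is that the enrichment call is isolated in its own try/catch, so a 403 from the Connections API never propagates into the table load.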
+
+### Extension Surface
+
+- **viewType:** `ppds.connectionReferences`
+- **Layout:** Three-zone with virtual table + detail pane on row click
+- **Table columns:** Display Name, Logical Name, Connector, Connection Status (color-coded), Managed, Modified On
+- **Default sort:** logicalName ascending
+- **Solution filter:** Dropdown in toolbar (persisted across sessions)
+- **Detail view (on row click):** Connection details (status, owner, shared/personal), dependent flows (clickable), orphan indicator
+- **Actions:** Refresh (Ctrl+R), solution filter, analyze (orphan detection), Open in Maker, environment picker with theming
+- **Graceful degradation:** SPN auth — connection status column shows "N/A", panel otherwise fully functional
+
+### TUI Surface
+
+- **Class:** `ConnectionReferencesScreen` extending `TuiScreenBase`
+- **Hotkeys:** Ctrl+R (refresh), Enter (detail dialog), Ctrl+A (analyze), Ctrl+F (filter by solution), Ctrl+O (open in Maker)
+- **Dialogs:** `ConnectionReferenceDetailDialog`, `OrphanAnalysisDialog`
+
+### MCP Surface
+
+| Tool | Input | Output |
+|------|-------|--------|
+| `ppds_connection_references_list` | `{ solutionId?, environmentUrl? }` | List with connection status |
+| `ppds_connection_references_get` | `{ logicalName }` | Full detail with flows and connection info |
+| `ppds_connection_references_analyze` | `{ environmentUrl? }` | Orphaned references and flows |
+
+---
+
+## Acceptance Criteria
+
+| ID | Criterion | Test | Status |
+|----|-----------|------|--------|
+| AC-CR-01 | `connectionReferences/list` returns references with optional solution filter | TBD | ✅ |
+| AC-CR-02 | Connection status populated from Power Platform Connections API | TBD | ✅ |
+| AC-CR-03 | Graceful degradation when Connections API unavailable (SPN) — panel loads, status shows N/A | TBD | ✅ |
+| AC-CR-04 | VS Code panel displays table with color-coded connection status | TBD | ✅ |
+| AC-CR-05 | Solution filter persists selection to globalState with panel-specific key | TBD | 🔲 |
+| AC-CR-06 | Row click shows detail pane with connection info, dependent flows, orphan status | TBD | ✅ |
+| AC-CR-07 | Analyze action identifies orphaned references and flows | TBD | ✅ |
+| AC-CR-08 | TUI ConnectionReferencesScreen displays same data and operations | TBD | ✅ |
+| AC-CR-09 | MCP ppds_connection_references_analyze returns structured orphan analysis | TBD | ✅ |
+| AC-CR-10 | Open in Maker uses `buildMakerUrl()` (not inline URL construction) | TBD | 🔲 |
+| AC-CR-11 | Panel unit tests cover message handling, environment switching, data loading | TBD | 🔲 |
+
+---
+
+## Design Decisions
+
+### Why Connections API (BAPI) for connection status?
+
+**Context:** Panel could show only Dataverse data (connection reference records without connection health).
+
+**Decision:** Include connection status from Power Platform API and detail pane with dependent flows. Defer connection picker/rebinding.
+
+**Rationale:** `IConnectionService` already exists. Without status, panel shows raw connector IDs — low value. SPN graceful degradation keeps panel functional for all auth methods. Connection picker deferred because the legacy extension never shipped it and deployment settings sync covers the automated use case.
+
+### Why orphan detection as a dedicated analyze action?
+
+**Context:** Could surface orphan indicators inline on every load, or provide a separate analysis action.
+
+**Decision:** Separate `analyze` action that identifies orphaned references (no connection) and orphaned flows (missing connection reference).
+
+**Rationale:** Orphan analysis requires cross-referencing flows, connections, and connection references — heavier than a standard list. Making it explicit avoids slow default loads while providing high-value diagnostic capability.
+
+---
+
+## Changelog
+
+| Date | Change |
+|------|--------|
+| 2026-03-18 | Extracted from panel-parity.md per SL1 |
+
+---
+
+## Related Specs
+
+- [architecture.md](./architecture.md) — Application Service boundary
+- [connection-pooling.md](./connection-pooling.md) — Dataverse connection management
+- [CONSTITUTION.md](./CONSTITUTION.md) — Governing principles
diff --git a/specs/data-explorer.md b/specs/data-explorer.md
new file mode 100644
index 000000000..4bad1a7a9
--- /dev/null
+++ b/specs/data-explorer.md
@@ -0,0 +1,518 @@
+# Data Explorer
+
+**Status:** Draft
+**Last Updated:** 2026-03-18
+**Code:** [src/PPDS.Extension/src/panels/QueryPanel.ts](../src/PPDS.Extension/src/panels/QueryPanel.ts)
+**Surfaces:** Extension, TUI, MCP
+
+---
+
+## Overview
+
+The Data Explorer provides interactive query execution and result browsing across all PPDS surfaces (VS Code webview, TUI, MCP). This spec covers the VS Code panel enhancements including Monaco editor integration and rectangular selection/copy UX.
+
+The VS Code panel (`QueryPanel.ts`) hosts a webview that currently uses a raw `