Merged
5 changes: 4 additions & 1 deletion .devcontainer/devcontainer.json
Original file line number Diff line number Diff line change
@@ -1,9 +1,12 @@
{
"name": "workflows-env",
"image": "mcr.microsoft.com/devcontainers/python:3.11",
"onCreateCommand": "sudo apt-get update && sudo apt-get install -y python3-venv curl tar && sudo mkdir -p /usr/local/bin && curl -sSL 'https://github.com/rhysd/actionlint/releases/download/v1.7.3/actionlint_1.7.3_linux_amd64.tar.gz' | sudo tar -xz -C /usr/local/bin actionlint && sudo chmod +x /usr/local/bin/actionlint",
"onCreateCommand": "sudo apt-get update && sudo apt-get install -y python3-venv curl tar && sudo mkdir -p /usr/local/bin && curl -sSL 'https://github.com/rhysd/actionlint/releases/download/v1.7.3/actionlint_1.7.3_linux_amd64.tar.gz' | sudo tar -xz -C /usr/local/bin actionlint && sudo chmod +x /usr/local/bin/actionlint && sudo rm -f /usr/local/py-utils/bin/black /usr/local/py-utils/bin/ruff /usr/local/py-utils/bin/isort /usr/local/py-utils/bin/mypy",
Copilot AI Jan 19, 2026
The onCreateCommand is extremely long (over 400 characters), which makes it difficult to read and maintain. Consider breaking it into multiple commands or using a separate setup script for better readability.

Suggested change
"onCreateCommand": "sudo apt-get update && sudo apt-get install -y python3-venv curl tar && sudo mkdir -p /usr/local/bin && curl -sSL 'https://github.com/rhysd/actionlint/releases/download/v1.7.3/actionlint_1.7.3_linux_amd64.tar.gz' | sudo tar -xz -C /usr/local/bin actionlint && sudo chmod +x /usr/local/bin/actionlint && sudo rm -f /usr/local/py-utils/bin/black /usr/local/py-utils/bin/ruff /usr/local/py-utils/bin/isort /usr/local/py-utils/bin/mypy",
"onCreateCommand": [
"sudo apt-get update",
"sudo apt-get install -y python3-venv curl tar",
"sudo mkdir -p /usr/local/bin",
"curl -sSL 'https://github.com/rhysd/actionlint/releases/download/v1.7.3/actionlint_1.7.3_linux_amd64.tar.gz' | sudo tar -xz -C /usr/local/bin actionlint",
"sudo chmod +x /usr/local/bin/actionlint",
"sudo rm -f /usr/local/py-utils/bin/black /usr/local/py-utils/bin/ruff /usr/local/py-utils/bin/isort /usr/local/py-utils/bin/mypy"
],

"postCreateCommand": "pip install -e '.[dev]' && pre-commit install --install-hooks --hook-type pre-commit --hook-type pre-push",
"postStartCommand": "pre-commit install --install-hooks --hook-type pre-commit --hook-type pre-push",
"containerEnv": {
"PATH": "/home/vscode/.local/bin:${containerEnv:PATH}"
},
"features": {
"ghcr.io/devcontainers/features/github-cli:1": {},
"ghcr.io/devcontainers/features/node:1": {
9 changes: 9 additions & 0 deletions .github/sync-manifest.yml
Original file line number Diff line number Diff line change
@@ -51,9 +51,18 @@ workflows:
- source: .github/workflows/agents-keepalive-loop.yml
description: "Keepalive loop - continues agent work until tasks complete"

- source: .github/workflows/agents-71-codex-belt-dispatcher.yml
description: "Codex belt dispatcher - selects issues and creates codex/issue-N branches for agent work"

- source: .github/workflows/agents-72-codex-belt-worker.yml
description: "Codex belt worker - executes Codex agent on issues with full prompt and context"

- source: .github/workflows/agents-72-codex-belt-worker-dispatch.yml
description: "Codex belt worker dispatch wrapper - allows workflow_dispatch for the worker"

- source: .github/workflows/agents-73-codex-belt-conveyor.yml
description: "Codex belt conveyor - orchestrates belt worker execution and handles completion"

- source: .github/workflows/agents-autofix-loop.yml
description: "Autofix loop - dispatches Codex when autofix can't fix Gate failures"
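Each manifest entry pairs a `source` path with a human-readable description. As a rough sketch of how the sync tooling might consume these entries (a hypothetical example assuming PyYAML parsing; the real tooling may differ), the declared workflow paths can be collected into a set for later comparison:

```python
import yaml  # PyYAML; the health-73 workflow below installs it with `pip install pyyaml`

# Hypothetical excerpt mirroring the entries added in this hunk.
MANIFEST = """
workflows:
  - source: .github/workflows/agents-71-codex-belt-dispatcher.yml
    description: "Codex belt dispatcher - selects issues and creates codex/issue-N branches"
  - source: .github/workflows/agents-72-codex-belt-worker.yml
    description: "Codex belt worker - executes Codex agent on issues"
"""

manifest = yaml.safe_load(MANIFEST)
# Collect the declared workflow paths; a completeness check can diff this
# set against the files actually present on disk.
declared = {entry["source"] for entry in manifest["workflows"]}
print(sorted(declared))
```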

4 changes: 2 additions & 2 deletions .github/workflows/health-72-template-sync.yml
Original file line number Diff line number Diff line change
@@ -16,12 +16,12 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6

- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: '3.11'

- name: Validate template sync
run: |
if ! python scripts/validate_template_sync.py; then
43 changes: 43 additions & 0 deletions .github/workflows/health-73-template-completeness.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,43 @@
name: Health 73 Template Completeness

# Validates that workflows intended for consumer repos are in the template and manifest.
# This catches the case where a workflow is added to Workflows but not synced.

on:
push:
branches: [main]
paths:
- '.github/workflows/*.yml'
- 'templates/consumer-repo/.github/workflows/*.yml'
- '.github/sync-manifest.yml'
- 'scripts/validate_template_completeness.py'
pull_request:
paths:
- '.github/workflows/*.yml'
- 'templates/consumer-repo/.github/workflows/*.yml'
- '.github/sync-manifest.yml'
- 'scripts/validate_template_completeness.py'
workflow_dispatch:

permissions:
contents: read

jobs:
validate:
name: Check template completeness
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v6

- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.12'

- name: Install dependencies
run: pip install pyyaml

- name: Validate template completeness
run: |
python scripts/validate_template_completeness.py --strict
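The validator script's internals are not shown in this diff. As a rough sketch (all names and rules assumed; the real `scripts/validate_template_completeness.py` may classify consumer-intended workflows differently), a completeness check could compare the workflows on disk against the template directory and the manifest's declared sources:

```python
from pathlib import Path


def find_missing(workflows_dir: Path, template_dir: Path, declared: set[str]) -> set[str]:
    """Return workflow paths absent from the template copy or the manifest.

    Hypothetical helper: checks every *.yml under workflows_dir and flags it
    if it is not declared in the manifest or has no counterpart in the
    consumer-repo template directory.
    """
    missing = set()
    for wf in sorted(workflows_dir.glob("*.yml")):
        rel = f".github/workflows/{wf.name}"
        in_template = (template_dir / wf.name).exists()
        if rel not in declared or not in_template:
            missing.add(rel)
    return missing
```

In `--strict` mode the real script presumably exits nonzero when this set is non-empty, which is what fails the PR check.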
1 change: 1 addition & 0 deletions docs/ci/WORKFLOWS.md
Original file line number Diff line number Diff line change
@@ -171,6 +171,7 @@ Scheduled health jobs keep the automation ecosystem aligned:
* [`health-70-validate-sync-manifest.yml`](../../.github/workflows/health-70-validate-sync-manifest.yml) validates that sync-manifest.yml is complete - ensures all sync-able files are declared (PR, push).
* [`health-71-sync-health-check.yml`](../../.github/workflows/health-71-sync-health-check.yml) monitors sync workflow health daily - creates issues if all recent runs failed or sync is stale (daily schedule, manual dispatch).
* [`health-72-template-sync.yml`](../../.github/workflows/health-72-template-sync.yml) validates that template files are in sync with source scripts - fails if `.github/scripts/` changes but `templates/consumer-repo/` isn't updated (PR, push on script changes).
* [`health-73-template-completeness.yml`](../../.github/workflows/health-73-template-completeness.yml) validates that consumer-intended workflows exist in the template directory and sync manifest - prevents workflows from being added to .github/workflows/ without being synced to consumer repos (PR, push on workflow/template changes).
* [`health-75-api-rate-diagnostic.yml`](../../.github/workflows/health-75-api-rate-diagnostic.yml) monitors API rate limit utilization across PATs and GitHub Apps - alerts when usage exceeds 85% and provides load balancing analysis (scheduled every 4 hours, manual dispatch).
* [`maint-68-sync-consumer-repos.yml`](../../.github/workflows/maint-68-sync-consumer-repos.yml) pushes workflow template updates to registered consumer repos (release, template push, manual dispatch).
* [`maint-69-sync-integration-repo.yml`](../../.github/workflows/maint-69-sync-integration-repo.yml) syncs integration-repo templates to Workflows-Integration-Tests repository (template push, manual dispatch with dry-run support).
1 change: 1 addition & 0 deletions docs/ci/WORKFLOW_SYSTEM.md
Original file line number Diff line number Diff line change
@@ -696,6 +696,7 @@ Keep this table handy when you are triaging automation: it confirms which workfl
| **Health 70 Validate Sync Manifest** (`health-70-validate-sync-manifest.yml`, maintenance bucket) | `pull_request`, `push` | Validate that sync-manifest.yml includes all sync-able files. Fails PRs that add workflows/prompts/scripts without updating manifest. | ⚪ Required on PRs | [Manifest validation runs](https://github.com/stranske/Workflows/actions/workflows/health-70-validate-sync-manifest.yml) |
| **Health 71 Sync Health Check** (`health-71-sync-health-check.yml`, maintenance bucket) | `schedule` (daily), `workflow_dispatch` | Monitor sync workflow health and create issues when all recent runs failed or sync is stale. | ⚪ Scheduled/manual | [Sync health check runs](https://github.com/stranske/Workflows/actions/workflows/health-71-sync-health-check.yml) |
| **Health 72 Template Sync** (`health-72-template-sync.yml`, maintenance bucket) | `pull_request`, `push` (`.github/scripts/`, `templates/`) | Validate that template files are in sync with source scripts. Fails if `.github/scripts/*.js` changes but `templates/consumer-repo/` isn't updated. | ⚪ Required on PRs | [Template sync validation runs](https://github.com/stranske/Workflows/actions/workflows/health-72-template-sync.yml) |
| **Health 73 Template Completeness** (`health-73-template-completeness.yml`, maintenance bucket) | `pull_request`, `push` (`.github/workflows/`, `templates/`, manifest) | Validate that consumer-intended workflows exist in template and manifest. Prevents workflows added to .github/workflows/ without being synced to consumer repos. | ⚪ Required on PRs | [Template completeness runs](https://github.com/stranske/Workflows/actions/workflows/health-73-template-completeness.yml) |
| **Health 75 API Rate Diagnostic** (`health-75-api-rate-diagnostic.yml`, maintenance bucket) | `schedule` (every 4 hours), `workflow_dispatch` | Monitor API rate limit utilization across GITHUB_TOKEN, PATs, and GitHub Apps. Alerts when usage exceeds 85%, tracks consumer repo workflow activity, and provides load balancing analysis. | ⚪ Scheduled/manual | [API rate diagnostic runs](https://github.com/stranske/Workflows/actions/workflows/health-75-api-rate-diagnostic.yml) |
| **Maint 68 Sync Consumer Repos** (`maint-68-sync-consumer-repos.yml`, maintenance bucket) | `release`, `push` (templates), `workflow_dispatch` | Push workflow template updates to registered consumer repositories. Creates PRs in consumer repos when templates change. | ⚪ Automatic/manual | [Consumer sync runs](https://github.com/stranske/Workflows/actions/workflows/maint-68-sync-consumer-repos.yml) |
| **Maint 69 Sync Integration Repo** (`maint-69-sync-integration-repo.yml`, maintenance bucket) | `push` (templates), `workflow_dispatch` | Sync integration-repo templates to Workflows-Integration-Tests repository. Resolves drift detected by Health 67. Supports dry-run mode. | ⚪ Automatic/manual | [Integration sync runs](https://github.com/stranske/Workflows/actions/workflows/maint-69-sync-integration-repo.yml) |
236 changes: 236 additions & 0 deletions manager-database-pr327-fix.patch
Original file line number Diff line number Diff line change
@@ -0,0 +1,236 @@
diff --git a/adapters/base.py b/adapters/base.py
index 93e845b..f4a4a64 100644
--- a/adapters/base.py
+++ b/adapters/base.py
@@ -10,9 +10,11 @@ from importlib import import_module
from typing import Any, Protocol

try:
- import psycopg
+ import psycopg as _psycopg
except ImportError: # pragma: no cover - optional dependency
- psycopg = None
+ _psycopg = None # type: ignore[assignment]
+
+psycopg = _psycopg


class AdapterProtocol(Protocol):
@@ -77,7 +79,8 @@ async def tracked_call(source: str, endpoint: str, *, db_path: str | None = None
status = getattr(resp, "status_code", 0)
size = len(getattr(resp, "content", b""))
conn = connect_db(db_path)
- conn.execute("""CREATE TABLE IF NOT EXISTS api_usage (
+ conn.execute(
+ """CREATE TABLE IF NOT EXISTS api_usage (
id INTEGER PRIMARY KEY AUTOINCREMENT,
ts TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
source TEXT,
@@ -86,7 +89,8 @@ async def tracked_call(source: str, endpoint: str, *, db_path: str | None = None
bytes INT,
latency_ms INT,
cost_usd REAL
- )""")
+ )"""
+ )
if isinstance(conn, sqlite3.Connection):
conn.execute(
"CREATE VIEW IF NOT EXISTS monthly_usage AS "
diff --git a/adapters/edgar.py b/adapters/edgar.py
index 4d4ba61..1d80aec 100644
--- a/adapters/edgar.py
+++ b/adapters/edgar.py
@@ -45,6 +45,10 @@ async def _request_with_retry(
extra={"url": url, "attempt": attempt, "max_retries": max_retries},
)
await asyncio.sleep(wait)
+ # Unreachable but satisfies type checker
+ raise RuntimeError("Unreachable") # pragma: no cover
+ # Unreachable but satisfies type checker
+ raise RuntimeError("Unreachable") # pragma: no cover


async def list_new_filings(cik: str, since: str) -> list[dict[str, str]]:
diff --git a/api/chat.py b/api/chat.py
index e2be31d..0c81d91 100644
--- a/api/chat.py
+++ b/api/chat.py
@@ -96,12 +96,7 @@ def chat(
q: str = Query(
...,
description="User question",
- examples={
- "basic": {
- "summary": "Holdings question",
- "value": "What is the latest holdings update?",
- }
- },
+ examples=["What is the latest holdings update?"],
)
):
"""Return a naive answer built from stored documents."""
diff --git a/api/managers.py b/api/managers.py
index 2a19661..5128f82 100644
--- a/api/managers.py
+++ b/api/managers.py
@@ -88,19 +88,23 @@ def _ensure_manager_table(conn) -> None:
"""Create the managers table if it does not exist."""
# Use dialect-specific schema to keep SQLite and Postgres aligned.
if isinstance(conn, sqlite3.Connection):
- conn.execute("""CREATE TABLE IF NOT EXISTS managers (
+ conn.execute(
+ """CREATE TABLE IF NOT EXISTS managers (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
email TEXT NOT NULL,
department TEXT NOT NULL
- )""")
+ )"""
+ )
else:
- conn.execute("""CREATE TABLE IF NOT EXISTS managers (
+ conn.execute(
+ """CREATE TABLE IF NOT EXISTS managers (
id bigserial PRIMARY KEY,
name text NOT NULL,
email text NOT NULL,
department text NOT NULL
- )""")
+ )"""
+ )


def _insert_manager(conn, payload: ManagerCreate) -> int:
@@ -111,7 +115,11 @@ def _insert_manager(conn, payload: ManagerCreate) -> int:
(payload.name, payload.email, payload.department),
)
conn.commit()
- return int(cursor.lastrowid)
+ return (
+ int(cursor.lastrowid)
+ if cursor.lastrowid is not None
+ else 0 if cursor.lastrowid is not None else 0
+ )
cursor = conn.execute(
"INSERT INTO managers(name, email, department) VALUES (%s, %s, %s) RETURNING id",
(payload.name, payload.email, payload.department),
diff --git a/embeddings.py b/embeddings.py
index e75e3f8..2b7f76e 100644
--- a/embeddings.py
+++ b/embeddings.py
@@ -51,22 +51,26 @@ def store_document(text: str, db_path: str | None = None) -> None:
if register_vector:
register_vector(conn)
conn.execute("CREATE EXTENSION IF NOT EXISTS vector")
- conn.execute("""CREATE TABLE IF NOT EXISTS documents (
+ conn.execute(
+ """CREATE TABLE IF NOT EXISTS documents (
id SERIAL PRIMARY KEY,
content TEXT,
embedding vector(384)
- )""")
+ )"""
+ )
emb = Vector(embed_text(text)) if register_vector else embed_text(text)
conn.execute(
"INSERT INTO documents(content, embedding) VALUES (%s,%s)",
(text, emb),
)
else:
- conn.execute("""CREATE TABLE IF NOT EXISTS documents (
+ conn.execute(
+ """CREATE TABLE IF NOT EXISTS documents (
id INTEGER PRIMARY KEY AUTOINCREMENT,
content TEXT,
embedding TEXT
- )""")
+ )"""
+ )
emb = json.dumps(embed_text(text))
conn.execute(
"INSERT INTO documents(content, embedding) VALUES (?, ?)",
diff --git a/etl/daily_diff_flow.py b/etl/daily_diff_flow.py
index 2cca646..918a0b9 100644
--- a/etl/daily_diff_flow.py
+++ b/etl/daily_diff_flow.py
@@ -20,12 +20,14 @@ def compute(cik: str, date: str, db_path: str) -> None:
try:
additions, exits = diff_holdings(cik, db_path)
conn = connect_db(db_path)
- conn.execute("""CREATE TABLE IF NOT EXISTS daily_diff (
+ conn.execute(
+ """CREATE TABLE IF NOT EXISTS daily_diff (
date TEXT,
cik TEXT,
cusip TEXT,
change TEXT
- )""")
+ )"""
+ )
for cusip in additions:
conn.execute(
"INSERT INTO daily_diff VALUES (?,?,?,?)",
diff --git a/etl/edgar_flow.py b/etl/edgar_flow.py
index 6bd964d..6860cb0 100644
--- a/etl/edgar_flow.py
+++ b/etl/edgar_flow.py
@@ -38,7 +38,8 @@ logger = logging.getLogger(__name__)
async def fetch_and_store(cik: str, since: str):
filings = await ADAPTER.list_new_filings(cik, since)
conn = connect_db(DB_PATH)
- conn.execute("""
+ conn.execute(
+ """
CREATE TABLE IF NOT EXISTS holdings (
cik TEXT,
accession TEXT,
@@ -48,7 +49,8 @@ async def fetch_and_store(cik: str, since: str):
value INTEGER,
sshPrnamt INTEGER
)
- """)
+ """
+ )
results = []
for filing in filings:
raw = await ADAPTER.download(filing)
diff --git a/etl/logging_setup.py b/etl/logging_setup.py
index d05d4d6..b49c500 100644
--- a/etl/logging_setup.py
+++ b/etl/logging_setup.py
@@ -10,9 +10,11 @@ from typing import Any
import boto3

try: # pragma: no cover - optional dependency for structured logs
- from pythonjsonlogger import jsonlogger
+ from pythonjsonlogger import jsonlogger as _jsonlogger
except ImportError: # pragma: no cover
- jsonlogger = None
+ _jsonlogger = None # type: ignore[assignment]
+
+jsonlogger = _jsonlogger

_LOGGING_CONFIGURED = False

diff --git a/tests/test_open_issues.py b/tests/test_open_issues.py
index 2531bfc..f68beae 100644
--- a/tests/test_open_issues.py
+++ b/tests/test_open_issues.py
@@ -4,13 +4,15 @@ from scripts.open_issues import parse_tasks


def test_parse_tasks(tmp_path):
- md = textwrap.dedent("""
+ md = textwrap.dedent(
+ """
### 4.1 Stage 0 — Bootstrap
1. Create docker-compose
2. Create schema
### 4.2 Stage 1 — Proof
* Implement adapter
- """)
+ """
+ )
file = tmp_path / "a.md"
file.write_text(md)
tasks = parse_tasks(str(file))
Comment on lines +1 to +236

Copilot AI Jan 19, 2026

This patch file appears to be unrelated to the PR's stated purpose of adding belt workflows to the template. It contains fixes for a "manager-database" project (PR #327), which is not mentioned in the PR description. The file should either be removed from this PR or explained in the description if it is intentionally included.
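Several hunks in the patch apply the same optional-import idiom (import under a private alias, then rebind to the public name), which the `psycopg` and `jsonlogger` changes use to satisfy the type checker when a dependency may be absent. A minimal standalone sketch of the idiom, using a deliberately fake module name so the fallback path runs:

```python
# Optional-import idiom mirroring the psycopg/jsonlogger hunks in the patch.
# The module name here is deliberately nonexistent for illustration.
try:
    import _definitely_not_installed as _optional_dep  # type: ignore[import-not-found]
except ImportError:  # pragma: no cover - optional dependency
    _optional_dep = None  # type: ignore[assignment]

# Rebinding through the private alias gives the public name a single,
# consistently typed definition, avoiding mypy redefinition errors that
# arise when `name = None` follows `import name` directly.
optional_dep = _optional_dep


def backend_name() -> str:
    """Report which code path is active."""
    return "real" if optional_dep is not None else "fallback"


print(backend_name())
```

The design choice here is that callers only ever see `optional_dep`, so `None`-checks stay localized and the import machinery is hidden behind one assignment.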