Add keepalive loop metrics emission #98
Conversation
Pull request overview
This PR adds metrics collection and emission to the agents keepalive loop workflow. The changes track workflow execution time, iteration counts, task completion progress, and loop termination reasons for later analysis and aggregation.
- Captures workflow start timestamp at the beginning of the evaluate job
- Computes and emits structured keepalive metrics including duration, iteration count, task progress, and stop reason
- Persists metrics as NDJSON artifacts with 30-day retention for downstream aggregation
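The bullets above can be sketched as a small helper. This is a hypothetical Python re-creation of what the workflow step does (the actual emission happens inside the YAML workflow); the function and field names are illustrative, not taken from the PR diff:

```python
import json
import time


def emit_keepalive_metrics(start_ts: int, iterations: int,
                           tasks_total: int, tasks_unchecked: int,
                           stop_reason: str,
                           path: str = "keepalive-metrics.ndjson") -> dict:
    """Append one NDJSON record describing a single keepalive loop run.

    start_ts is the workflow start timestamp captured at the beginning
    of the evaluate job (Unix seconds).
    """
    record = {
        "duration_s": int(time.time()) - start_ts,
        "iterations": iterations,
        "tasks_total": tasks_total,
        "tasks_done": tasks_total - tasks_unchecked,
        "stop_reason": stop_reason,
    }
    # NDJSON: one JSON object per line, appended so runs accumulate.
    with open(path, "a") as fh:
        fh.write(json.dumps(record) + "\n")
    return record
```

The NDJSON file would then be uploaded as a workflow artifact (with the 30-day retention noted above) for the downstream aggregator to consume.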
Comments suppressed due to low confidence (1)
.github/workflows/agents-keepalive-loop.yml:127
- The regex pattern allows negative numbers for tasks_total and tasks_unchecked, which doesn't make logical sense for task counts. Consider using the pattern ^[0-9]+$ (without the optional minus sign) to match only non-negative integers, consistent with the START_TS validation pattern.
Automated Status Summary
Head SHA: dc5bac0
Coverage Overview
Coverage Trend
Updated automatically; will refresh on subsequent CI/Docker completions.
Keepalive checklist
Scope: No scope information available
Tasks
Acceptance criteria
Keepalive loop status for PR #98
* chore(codex): bootstrap PR for issue #93
* Add metrics reporting to autofix loop (#97)
* Add keepalive metrics summary and artifact (#98)
* Add verifier workflow metrics emission
* Add weekly agent metrics aggregation
* docs: add agents-weekly-metrics.yml to workflow inventory and docs
  - Add to EXPECTED_NAMES in test_workflow_naming.py
  - Add to WORKFLOWS.md workflow catalog
  - Add to WORKFLOW_SYSTEM.md workflow table
  - Fix executable bit on aggregate_agent_metrics.py

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
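The weekly aggregation step mentioned in the commit list could fold the per-run NDJSON records into summary statistics. This is a minimal sketch of that shape, assuming the record fields from the keepalive emitter (`duration_s`, `stop_reason`); it is not the contents of `aggregate_agent_metrics.py`:

```python
import json
from collections import Counter


def aggregate_metrics(ndjson_lines):
    """Fold per-run NDJSON metric records into a weekly summary."""
    records = [json.loads(line) for line in ndjson_lines if line.strip()]
    runs = len(records)
    return {
        "runs": runs,
        "mean_duration_s": (sum(r["duration_s"] for r in records) / runs)
                           if runs else 0.0,
        # Tally of why each loop run terminated.
        "stop_reasons": dict(Counter(r["stop_reason"] for r in records)),
    }
```

Blank lines are skipped so concatenated artifact files aggregate cleanly.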
* feat(verifier): add acceptance_count output for metrics

  Cherry-picked from PR #100. Counts acceptance criteria checkboxes in the PR body and linked issues, outputting acceptance_count for use in verifier metrics. This was the unique remaining value from PR #100; the rest of its changes (PRs #97, #98) were already merged.

  Closes #100

* Update .github/scripts/agents_verifier_context.js

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
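Counting acceptance-criteria checkboxes amounts to matching Markdown task-list items. The actual helper lives in `agents_verifier_context.js`; this is a hypothetical Python equivalent to show the idea, counting both checked and unchecked boxes:

```python
import re

# Markdown task-list item: list marker, then "[ ]", "[x]", or "[X]".
CHECKBOX = re.compile(r"^\s*[-*]\s+\[(?: |x|X)\]", re.MULTILINE)


def acceptance_count(body: str) -> int:
    """Count task-list checkboxes (checked or not) in a PR/issue body."""
    return len(CHECKBOX.findall(body or ""))
```

Anchoring at line start means a `- [ ]` embedded mid-sentence is not miscounted as a checkbox.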