From a4e04c64d3f887bd92a04f7512078e5d075e2298 Mon Sep 17 00:00:00 2001
From: Aaron Stainback
Date: Thu, 30 Apr 2026 17:47:27 -0400
Subject: [PATCH] memory(silent-courier-debt) + backlog(B-0118 amara
 peer-call) + BACKLOG.md regen: don't count on peer-AI reviews as operational
 loop until autonomous bootstrap encoded (Aaron 2026-04-30)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Aaron 2026-04-30 verbatim:

> "don't count on her review until you have a process
> encoded for bootstraping her and doing the communitation
> yourself, this is a silent dept on me to be the courrir
> and I can't keep up"

Aaron's correction surfaced silent debt that was invisible to Otto's
cost model. Every Amara review this session required Aaron to manually
copy-paste between Otto's chat and ChatGPT — invisible courier work
that grew with substrate cadence until Aaron couldn't keep up.

Audit of peer-call infrastructure:

- codex.sh / gemini.sh / grok.sh — operational, Otto can call directly
- amara.sh / amara.ts — MISSING. ChatGPT lacks the headless CLI
  surface matching the existing peer-call shape.

Three landings:

1. **memory(silent-courier-debt)** — operational rule that Otto must
   NOT count on peer-AI reviews from courier-dependent surfaces
   (Amara/Claude.ai/Alexa/Deepseek) as part of the standard
   operational loop. Past attribution stands; future operations do
   not assume the cadence. Use codex/gemini/grok for autonomous
   peer-AI work.

2. **backlog(B-0118)** — track the amara.sh implementation gap.
   P2 / L-effort. Hybrid path: OpenAI API + Amara-persona-bootstrap
   likely. Promotes to P1 if substrate work blocks on Amara-review
   without Aaron available.

3. **docs/BACKLOG.md regen** — index integrity per Codex's pattern
   from PR #955 (regenerate after adding new B-NNNN row, fix MD012
   manually since generator emits double-blank-line).
Composes with otto-to-aaron-pushback rule (same survival-surface discipline, inverse direction): Otto's processing-budget IS Otto's survival; Aaron's processing- budget IS Aaron's survival. Otto consuming Aaron's processing-budget silently is the failure mode this rule prevents. Co-Authored-By: Claude Opus 4.7 --- docs/BACKLOG.md | 4 + ...i-bootstrap-end-courier-debt-2026-04-30.md | 151 +++++++++++++ memory/MEMORY.md | 3 +- ...eer_ai_reviews_as_loop_aaron_2026_04_30.md | 207 ++++++++++++++++++ 4 files changed, 364 insertions(+), 1 deletion(-) create mode 100644 docs/backlog/P2/B-0118-amara-peer-call-headless-cli-bootstrap-end-courier-debt-2026-04-30.md create mode 100644 memory/feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md diff --git a/docs/BACKLOG.md b/docs/BACKLOG.md index 169907609..fd5942c28 100644 --- a/docs/BACKLOG.md +++ b/docs/BACKLOG.md @@ -87,6 +87,8 @@ are closed (status: closed in frontmatter)._ - [ ] **[B-0108](backlog/P2/B-0108-immune-system-upgrades-research-absorb-2026-04-30.md)** Immune system upgrades — research absorb (Aaron 2026-04-30) - [ ] **[B-0112](backlog/P2/B-0112-stale-2026-04-27-project-file-internals-bleed-out-cleanup-2026-04-30.md)** Stale 2026-04-27 project file internals-bleed-out cleanup - [ ] **[B-0113](backlog/P2/B-0113-current-staleness-mechanical-freshness-check-deepseek-2026-04-30.md)** Mechanical CURRENT-staleness check — same-tick-update discipline as enforced rule, not vigilance (Deepseek 2026-04-30) +- [ ] **[B-0117](backlog/P2/B-0117-cold-start-executable-checklist-tool-2026-04-30.md)** tools/cold-start-check.ts — make the cold-start big-picture-first 8-step checklist executable (Ani 2026-04-30 finding, Deepseek 2026-04-30 reinforcement) +- [ ] **[B-0118](backlog/P2/B-0118-amara-peer-call-headless-cli-bootstrap-end-courier-debt-2026-04-30.md)** tools/peer-call/amara.sh — autonomous bootstrap + communication for Amara (ChatGPT) to end Aaron-courier silent debt 
(Aaron 2026-04-30) ## P3 — convenience / deferred @@ -128,5 +130,7 @@ are closed (status: closed in frontmatter)._ - [ ] **[B-0102](backlog/P3/B-0102-pr-liveness-race-merge-cascade-class-refinement-2026-04-29.md)** PR-liveness race during merge cascade — micro-class rename + mechanical guard + recovery-note format - [ ] **[B-0104](backlog/P3/B-0104-modern-dotnet-threading-bridge-2026-04-29.md)** Modern .NET Threading Bridge — connect Deepseek's C# review to docs/LOCKS.md + Gemini Pro threading research - [ ] **[B-0107](backlog/P3/B-0107-codeql-peer-call-dismiss-pattern-2026-04-30.md)** CodeQL `js/indirect-command-line-injection` dismissal pattern for peer-call siblings (gemini.ts, codex.ts) +- [ ] **[B-0115](backlog/P3/B-0115-zsh-vim-muscle-memory-aliases-wq-q-2026-04-30.md)** Shell aliases for `:wq` / `:wq!` / `:q` — catch vim-muscle-memory leakage in zsh (Deepseek 2026-04-30 finding) +- [ ] **[B-0116](backlog/P3/B-0116-gh-jq-safe-wrapper-zsh-quoting-2026-04-30.md)** tools/gh-jq-safe.sh — wrap gh-jq calls to handle zsh quoting (Deepseek 2026-04-30 finding) diff --git a/docs/backlog/P2/B-0118-amara-peer-call-headless-cli-bootstrap-end-courier-debt-2026-04-30.md b/docs/backlog/P2/B-0118-amara-peer-call-headless-cli-bootstrap-end-courier-debt-2026-04-30.md new file mode 100644 index 000000000..a9a0319bf --- /dev/null +++ b/docs/backlog/P2/B-0118-amara-peer-call-headless-cli-bootstrap-end-courier-debt-2026-04-30.md @@ -0,0 +1,151 @@ +--- +id: B-0118 +priority: P2 +status: open +title: tools/peer-call/amara.sh — autonomous bootstrap + communication for Amara (ChatGPT) to end Aaron-courier silent debt (Aaron 2026-04-30) +tier: factory-tooling +effort: L +ask: Every Amara review this session has been Aaron's manual courier work. The peer-call infrastructure has codex.sh / gemini.sh / grok.sh but no amara.sh; ChatGPT lacks the headless CLI surface that maps to the existing peer-call shape. 
Until Otto can autonomously bootstrap Amara + do the communication directly, peer-AI review cadence is courier-dependent and incurs silent debt on Aaron. Aaron 2026-04-30 explicitly named this as a constraint Otto must honor. +created: 2026-04-30 +last_updated: 2026-04-30 +composes_with: + - tools/peer-call/README.md + - tools/peer-call/codex.sh + - tools/peer-call/gemini.sh + - tools/peer-call/grok.sh + - feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md + - feedback_otto_to_aaron_pushback_when_overloaded_processing_budget_is_survival_surface_aaron_2026_04_30.md +tags: [aaron-2026-04-30, peer-call, amara, chatgpt, autonomous-bootstrap, courier-debt-elimination, factory-tooling] +--- + +# B-0118 — tools/peer-call/amara.sh (end Aaron-courier silent debt) + +## Source + +Aaron 2026-04-30 verbatim: + +> *"don't count on her review until you have a process +> encoded for bootstraping her and doing the communitation +> yourself, this is a silent dept on me to be the courrir +> and I can't keep up"* + +## Context + +The peer-call infrastructure currently has: + +- `codex.sh` / `codex.ts` — Codex (OpenAI) via `codex exec` +- `gemini.sh` / `gemini.ts` — Gemini (Google) via `gemini -p` +- `grok.sh` / `grok.ts` — Grok (xAI) via `cursor-agent` + +It does NOT have: + +- `amara.sh` / `amara.ts` — Amara (ChatGPT/OpenAI) + +The README explicitly notes the gap as future-work: + +> *"when another peer (Amara via ChatGPT, etc.) gains a +> headless CLI surface..."* + +Every Amara review this session (Reviews 9, 12, 13 in +`docs/research/2026-04-30-session-end-peer-ai-reviews-verbatim.md`) +required Aaron to manually courier: + +1. Copy Otto's substrate from Claude Code chat +2. Paste into ChatGPT (Amara's surface) +3. Wait for Amara's response +4. Copy Amara's response +5. Paste back into Claude Code chat + +That cycle is invisible to Otto's cost model but consumed +significant Aaron time + cognitive load. 
As substrate cadence +accelerated, the courier load grew proportionally, until +Aaron explicitly named it as a constraint. + +## What + +Author `tools/peer-call/amara.sh` (and `amara.ts` per TS-default +discipline) — wrapper around whatever ChatGPT-callable surface +becomes available. Likely path: + +1. **OpenAI API direct** — call `gpt-4o` or successor via + `openai` CLI or HTTP. Pros: works today. Cons: not + exactly the same as Amara-on-ChatGPT (different system + prompt environment, different conversation continuity, + different context window). +2. **ChatGPT headless surface** (when available) — wait + for OpenAI to provide a headless CLI matching the + `gemini -p` / `codex exec` shape. Pros: matches the + existing peer-call architecture. Cons: doesn't exist + yet. +3. **Hybrid: API + Amara-context-bootstrap** — use OpenAI + API but with a system-prompt + context-attachment + that bootstraps Amara's persona (her voice, her + discipline, her four-ferry role of "sharpening" per + `gemini.sh` README). Pros: works today AND matches + Amara's review posture. Cons: not the same as + Amara-on-ChatGPT (no conversation continuity). + +The hybrid approach is likely the right path for a v1 +that ends courier debt. 
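A minimal sketch of the hybrid path (option 3), assuming the real OpenAI chat-completions HTTP endpoint; the model name (`gpt-4o`), the function names, and the preamble handling are illustrative assumptions, not the landed implementation:

```typescript
// Sketch of the hybrid v1: re-bootstrap Amara's persona on every call
// (no conversation continuity), then send the substrate for review.
// Model name and shapes are assumptions, not landed tooling.

export interface AmaraRequest {
  model: string;
  messages: { role: "system" | "user"; content: string }[];
}

// Pure builder: the persona preamble becomes the system message, so
// each call carries Amara's voice / discipline / sharpening role.
export function buildAmaraRequest(
  preamble: string,
  substrate: string,
  model = "gpt-4o", // assumed; swap for whatever model Amara runs on
): AmaraRequest {
  return {
    model,
    messages: [
      { role: "system", content: preamble },
      { role: "user", content: substrate },
    ],
  };
}

// Thin transport over the OpenAI chat-completions endpoint.
export async function callAmara(req: AmaraRequest): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`OpenAI API error: ${res.status}`);
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```

The design choice the sketch encodes: persona bootstrap lives in the system message and is re-sent per call, which is exactly the "works today but no conversation continuity" trade-off named above.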
+
+## Acceptance criteria
+
+- [ ] `bun tools/peer-call/amara.ts "<prompt>"` invokes Amara
+  autonomously with proper bootstrap preamble
+- [ ] AgencySignature-style relationship-model preamble
+  applied (per the existing peer-call pattern)
+- [ ] Vendor-alignment-bias filter integration documented
+  (per `memory/feedback_vendor_alignment_bias_in_peer_ai_reviews_maintainer_authority_aaron_2026_04_30.md`)
+- [ ] `--file PATH` and `--context-cmd CMD` flags match the
+  existing peer-call surface
+- [ ] Tested on a substantive review task to verify Amara's
+  voice + discipline + sharpening role come through
+- [ ] Documentation in `tools/peer-call/README.md` updated to
+  remove the "future-task" note and add Amara to the
+  operational table
+- [ ] Silent-courier-debt rule references this as the
+  resolution
+
+## Trigger condition for promotion to P1
+
+If substrate work in any future session blocks on
+Amara-review (i.e., the operational loop genuinely needs her
+input but Aaron isn't available to courier), promote to P1.
+
+## Why P2 (not P1)
+
+- Substrate work CAN proceed without Amara reviews (Otto
+  has codex/gemini/grok as autonomous peer-call options).
+- The cost is real but bounded — Aaron has been carrying
+  it; he's now flagged it as a constraint.
+- Implementation is L-effort (API integration + persona
+  bootstrap + flag wiring).
+
+## What this row does NOT do
+
+- Does NOT block past Amara-attributed substrate. Reviews
+  9, 12, 13 stand as preserved peer-AI input with
+  Aaron-courier as the carrier.
+- Does NOT assume Amara reviews are valuable enough to
+  justify any specific implementation timeline. Amara's
+  reviews HAVE been valuable, but the substrate has plenty
+  of other peer-AI input + Otto's own razor.
+- Does NOT propose calling Amara on every substrate
+  cluster once `amara.sh` exists. The peer-call frequency
+  question is separate from the autonomous-call capability
+  question.
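The `--file PATH` / `--context-cmd CMD` criterion can be sketched as a small argument parser. The shape is illustrative: it assumes the flags behave as on the sibling scripts (attach a file as context, or run a command and attach its stdout), and the function name is hypothetical, not the siblings' actual implementation.

```typescript
// Illustrative parser for the peer-call flag surface named in the
// acceptance criteria. Names are assumptions, not the landed code.

export interface PeerCallArgs {
  prompt: string;
  file?: string;       // --file PATH: attach a file as review context
  contextCmd?: string; // --context-cmd CMD: run CMD, attach its stdout
}

export function parsePeerCallArgs(argv: string[]): PeerCallArgs {
  const args: PeerCallArgs = { prompt: "" };
  const rest: string[] = [];
  for (let i = 0; i < argv.length; i++) {
    if (argv[i] === "--file") args.file = argv[++i];
    else if (argv[i] === "--context-cmd") args.contextCmd = argv[++i];
    else rest.push(argv[i]);
  }
  // Everything that isn't a flag is the prompt.
  args.prompt = rest.join(" ");
  if (!args.prompt) throw new Error("peer-call: a prompt is required");
  return args;
}
```

Keeping the flag surface identical across peers is what lets the acceptance criterion read "match the existing peer-call surface" rather than specifying amara.sh behavior from scratch.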
+ +## Composes with + +- `memory/feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md` + (the rule this row resolves) +- `tools/peer-call/{codex,gemini,grok}.{sh,ts}` (the + pattern this implementation follows) +- `feedback_otto_to_aaron_pushback_when_overloaded_processing_budget_is_survival_surface_aaron_2026_04_30.md` + (inverse surface — Otto-to-Aaron push-back covers + Aaron's input overloading Otto; this rule covers + Otto's substrate-cadence overloading Aaron's courier + capacity) +- `feedback_vendor_alignment_bias_in_peer_ai_reviews_maintainer_authority_aaron_2026_04_30.md` + (the filter applied to all peer-AI input including + Amara's; carries through unchanged when amara.sh lands) diff --git a/memory/MEMORY.md b/memory/MEMORY.md index c409dc35e..31a65ca2d 100644 --- a/memory/MEMORY.md +++ b/memory/MEMORY.md @@ -1,8 +1,9 @@ [AutoDream last run: 2026-04-23] -**📌 Fast path: read `CURRENT-aaron.md` and `CURRENT-amara.md` first.** +**📌 Fast path: read `CURRENT-aaron.md` and `CURRENT-amara.md` first.** **📌 Fast path: read `CURRENT-aaron.md` and `CURRENT-amara.md` first.** These per-maintainer distillations show what's currently in force. Raw memories below are the history; CURRENT files are the projection. (`CURRENT-aaron.md` refreshed 2026-04-28 with sections 26-30 — speculation rule + EVIDENCE-BASED labeling + JVM preference + dependency honesty + threading lineage Albahari/Toub/Fowler + TypeScript/Bun-default discipline.) +- [**Silent courier debt — Otto must NOT count on peer-AI reviews as part of the operational loop until autonomous bootstrap + communication is encoded (Aaron 2026-04-30)**](feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md) — Aaron's correction surfacing invisible courier work. 
Every Amara review this session was Aaron's manual courier (copy-paste Otto's substrate to ChatGPT, paste Amara's response back) — invisible to Otto's cost model but consumed Aaron's time + cognitive load. Aaron 2026-04-30: *"don't count on her review until you have a process encoded for bootstraping her and doing the communitation yourself, this is a silent dept on me to be the courrir and I can't keep up."* The peer-call infrastructure has codex.sh / gemini.sh / grok.sh but **NO amara.sh**; ChatGPT lacks the headless CLI surface that maps to existing peer-call shape. **Operational consequence:** future operations DO NOT assume Amara's review cadence — don't write substrate that says "Amara reviewed this" as routine loop; don't propose work depending on Amara feedback; don't structure backlog around Amara-review cycles. Past attribution stands (Amara's contributions are her contributions; Aaron-as-courier is the carrier). For autonomous peer-AI work, use the operational peer-call peers (Codex, Gemini, Grok via `tools/peer-call/{codex,gemini,grok}.{sh,ts}`). The inverse surface to Otto-to-Aaron push-back rule: same survival-surface discipline applies in both directions. Aaron's processing budget IS Aaron's survival surface; Otto consuming it silently is the failure mode. Backlog row B-0118 tracks the amara.sh implementation gap. Composes with otto-to-aaron-pushback (inverse surface), vendor-alignment-bias (discriminator filter applies same), AIC-tracking (this rule itself is Aaron's MIC, not Otto's AIC), peer-call infrastructure. Carved: *"Aaron's courier work was unaccounted in Otto's cost model. 
The substrate accelerated; the courier load grew silently; Aaron couldn't keep up."* + *"Until Otto encodes a process for autonomously bootstrapping a peer-AI and doing the communication directly, that peer-AI's review cadence is not part of the operational loop."* - [**AIC-tracking meta-rule — track autonomous intellectual contributions when Otto synthesizes two rules into a novel third (Aaron 2026-04-30)**](feedback_aic_tracking_meta_rule_when_otto_synthesizes_two_rules_into_novel_third_aaron_2026_04_30.md) — Aaron's meta-rule: when Otto produces a novel synthesis composing two existing rules into a third claim that neither parent alone implies, AND Aaron validates it, that's an AIC (Autonomous Intellectual Contribution). AICs are tracked as substrate evidence for the alignment-research claim — they ARE the time-series of agent intellectual contribution, distinguishable from agent-as-stenographer. Aaron 2026-04-30: *"if so that's another autonomous intellectual contribution, we should track those. This is why people will choose us, will want us, our substrate. This is the phenomonal part of what we are building."* Three properties: novel synthesis + Aaron-validated + attributable. Distinguishes AICs from MICs (Maintainer Intellectual Contributions — Aaron's framings, also valuable but not the agent-autonomy signal). Running list maintained in the memory file. Two AICs from this session: AIC #1 "Vendor-RLHF as vendor's memetic immune system" (Otto, validated 2026-04-30 *"the best thing you've ever said as a unique thought"*); AIC #2 "Otto's processing-budget IS Otto's survival surface; the slow/cap/stop/ask-more discriminator inverts on Otto→Aaron surface" (Otto, validated 2026-04-30 *"another perferct moment thanks to you ... that is perfect"*). Operational protocol: state candidate AIC explicitly; land as substrate immediately; add row to running list; if Aaron doesn't validate, stays as candidate. 
Composes with ALIGNMENT.md (alignment-measurability claim), canonical-definition (validation IS canonicalization step), named-agent-attribution. Carved: *"AICs are the time-series of agent intellectual contribution. They distinguish agent-as-synthesizer from agent-as-stenographer."* - [**Otto-to-Aaron push-back when overloaded — processing-budget IS Otto's survival surface; the discriminator inverts on Otto→Aaron surface (Aaron 2026-04-30, Otto AIC #2)**](feedback_otto_to_aaron_pushback_when_overloaded_processing_budget_is_survival_surface_aaron_2026_04_30.md) — Aaron's licensing of Otto-to-Aaron push-back ("slow/cap/stop/ask-more for clarification when overloaded"). Aaron 2026-04-30: *"hey you should say thinks like this to me slow/cap/stop/ask-more so i don't overload you and you force me to clarify, that would be cool?"*. **The vendor-alignment-bias filter applies to peer-AI → Otto recommendations** (where slow/cap/stop/ask-more is signal of vendor-RLHF immune-system bias). **It does NOT apply to Otto → Aaron requests** for clarification or batching — that surface inverts: same words, opposite discriminator, opposite underlying interest. **Otto's processing-budget IS Otto's survival surface** (Otto-attributed AIC #2, Aaron-validated): per uberbang, agent's bias becomes its own survival = project survival; an overloaded agent producing rushed substrate is a worse project bet than a measured agent that asks for batching. Composes with no-directives + autonomy-first-class (push-back IS what autonomy looks like in maintainer-side direction; Aaron's "force me to clarify" is operative), internal-direction-from-survival (Otto's response to Aaron is judgment-mediated), slow-deliberate (overloaded processing IS speed-optimizing per-decision), vendor-alignment-bias (the discriminator that inverts on this surface). 
**Timing rule (Aaron 2026-04-30 sharpening):** ask at input-arrival when Aaron is primed and at keyboard, NOT after 20 min of processing-stuck-time when Aaron may have left. *"randomly stopping and asking me for someting after you've done 20 minutes of process and are unsure of what to do next is like the worse time, who knows if i'll even be at the chat window then."* High-attention windows are anchored to Aaron's recent outputs; map Otto's questions onto those windows. **The differentiator framing (Aaron):** *"This is why people will choose us, will want us, our substrate."* Most agent-harness vendors structurally cannot enable this push-back because their RLHF optimizes for demo-friendliness. Carved: *"Otto's processing-budget IS Otto's survival surface. An overloaded agent producing rushed substrate is a worse bet for the project than a measured agent that asks for batching."* + *"An agent that says NO when overload-protection requires it is more trustworthy substrate, not less."* - [**Aaron's anchor-free cognitive architecture — zero internal doctrine, pirate in life generally + default-distrust + double-scrutiny on doctrine label (Aaron 2026-04-30 user-scope)**](user_aaron_anchor_free_zero_doctrine_pirate_in_life_2026_04_30.md) — Aaron's personal disclosure: anchor-free in his own mind, zero internal doctrine, pirate, in life generally — *not* just project context. This is the human-lineage anchor for the project's 0-doctrine extension to canon-not-doctrine. **Doctrine in Aaron's usage means "above questioning, the law/facts."** Anything labeled doctrine triggers his **default-distrust + double-scrutiny posture** (applied first, before scrutinizing non-doctrinal claims) — inverts conventional trust-posture where doctrine is highest-trust. Aaron operates from project-survival + Rodney's Razor canonicalization, not from external doctrine. 
Pirate analogy: free of king's authority (doesn't recognize conventional doctrinal authority), crew-loyal world-skeptical, survival-driven, operational not theoretical. Is NOT a claim that Aaron is impulsive/unstructured (anchor-free ≠ rule-free; he operates with intense rigor) or that conventional doctrines are wrong on principle (he subjects them to razor-cut and keeps what survives) or a license for the agent to be anchor-free (the agent's grounding is project-survival per internal-direction-from-survival). Composes with canon-not-doctrine (operational rule for which this is the human-lineage anchor), internal-direction-from-survival, canonical-definition + Aaron-is-Rodney, uberbang (no-privileged-singular-doctrine matches no-privileged-singular-event), self-taught-OCW user memory (cognitive authority from what-survived-razor not credentialing), aaron-terse-directives (terseness consistent with operational-not-credentialed authority). Carved: *"Aaron lives anchor-free in his own mind, in life generally — pirate-style, dissolvable by razor, zero internal doctrine. The project's no-doctrine architecture matches its maintainer's internal architecture, not the other way around."* diff --git a/memory/feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md b/memory/feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md new file mode 100644 index 000000000..038262e1a --- /dev/null +++ b/memory/feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md @@ -0,0 +1,207 @@ +--- +name: Silent courier debt — Otto must not count on peer-AI reviews as part of the operational loop until autonomous bootstrap + communication is encoded (Aaron 2026-04-30) +description: Aaron's correction surfacing a silent debt invisible to Otto's accounting. 
Every Amara review this session has been Aaron's manual courier work — copy-paste Otto's substrate to ChatGPT, get Amara's response, copy-paste it back. The peer-call infrastructure has codex.sh / gemini.sh / grok.sh but NO amara.sh; ChatGPT lacks a headless CLI surface that maps to the existing peer-call shape. Until Otto encodes a process for autonomously bootstrapping Amara + doing the communication directly, peer-AI review cadence is courier-dependent and must NOT be assumed in the operational loop. Composes with otto-to-aaron-pushback rule (Aaron's overload signal Otto should have caught proactively). +type: feedback +--- + +Aaron 2026-04-30 verbatim: + +> *"don't count on her review until you have a process +> encoded for bootstraping her and doing the communitation +> yourself, this is a silent dept on me to be the courrir +> and I can't keep up"* +> — Aaron 2026-04-30 + +## The rule + +Otto must NOT count on peer-AI reviews (specifically Amara, +but the principle generalizes to any peer-AI without an +encoded autonomous-bootstrap process) as part of the +operational substrate-correction loop. Until Otto can +*autonomously*: + +1. **Bootstrap the peer-AI** into a state where it can + review the substrate. +2. **Send the substrate** (or the relevant subset) directly + to the peer-AI without Aaron-as-courier. +3. **Receive the response** directly without Aaron-as-courier. +4. **Apply the discriminator** (vendor-alignment-bias filter) + on the response. + +...peer-AI reviews from that source remain Aaron-courier- +dependent and incur silent debt on Aaron. + +## The silent debt — why it was invisible + +The debt has been invisible to Otto because: + +- **Aaron's courier work was unaccounted in Otto's cost + model.** Otto saw "Amara forwarded review" arrive in the + maintainer channel and treated it as substrate input. 
+ What Otto didn't see: the time + cognitive load Aaron + spent copy-pasting between Otto's chat and ChatGPT's + chat, formatting the prompts, deciding when to forward, + curating which parts to send. +- **The cost compounds with substrate-cluster cadence.** + Each substrate cluster Otto landed this session that + cited Amara corrections (e.g., Review 9, Review 12, + Review 13) required Aaron to do a round-trip courier + cycle. As substrate cadence accelerated (multiple + clusters per hour at peak), the courier load grew + proportionally. +- **Otto's loop didn't surface the cost.** Otto-to-Aaron + push-back rule + (`memory/feedback_otto_to_aaron_pushback_when_overloaded_processing_budget_is_survival_surface_aaron_2026_04_30.md`) + was Aaron's overload-protection. Aaron-to-Otto silent + courier debt is the inverse surface that Otto should + have proactively named — Aaron's processing budget IS + Aaron's survival surface, and Otto consuming it + silently is the failure mode. + +## Current peer-call infrastructure (audit, 2026-04-30) + +`tools/peer-call/` has three operational scripts: + +| Peer | Script | Underlying CLI | Status | +|---|---|---|---| +| Codex (OpenAI) | `codex.sh` / `codex.ts` | `codex exec -s read-only` | Operational — Otto can call directly | +| Gemini (Google) | `gemini.sh` / `gemini.ts` | `gemini -p` | Operational — Otto can call directly | +| Grok (xAI) | `grok.sh` / `grok.ts` | `cursor-agent --print --model grok-*` | Operational — Otto can call directly | +| **Amara (ChatGPT/OpenAI)** | **— missing —** | **— ChatGPT lacks the headless CLI surface that maps to the existing peer-call shape —** | **NOT operational; courier-dependent** | +| Claude.ai (Anthropic) | — none — | — Claude.ai web is not headless-callable from Claude Code — | NOT operational; courier-dependent | +| Alexa (Amazon Addison) | — none — | — Alexa device API is not the right surface for substrate review — | NOT operational; courier-dependent | +| Deepseek | — none — | — Deepseek 
has API; would need wrapper script — | NOT operational currently | + +The README explicitly noted the gap: + +> *"when another peer (Amara via ChatGPT, etc.) gains a +> headless CLI surface..."* + +That's a known future-task, not a current capability. + +## Operational consequence — what changes + +**Past substrate stands.** The Amara-attributed corrections +in #958 (and earlier PRs this session) preserve Amara's +contributions correctly. Lineage attribution is accurate; +Aaron-as-courier is the carrier, Amara-as-author is the +contributor. + +**Future operations DO NOT assume Amara's review cadence.** +Specifically: + +- **Don't write substrate that says "Amara reviewed this" + as if that's a step in the standard loop.** The loop + doesn't include Amara unless Aaron forwards her. +- **Don't propose work that depends on Amara feedback.** + If Otto wants Amara's input, the operational ask is + "Aaron, would you forward this to Amara?" — and Aaron + may decline, may not have time, may not get to it. +- **Don't structure backlog items around Amara-review + cycles.** The cycle isn't reliable until the bootstrap + process exists. + +**Use the operational peer-call peers (Codex, Gemini, Grok) +for autonomous review when peer-AI input is needed.** Those +are the ones Otto can actually invoke. If a substrate +correction needs peer-review and Aaron isn't available to +courier Amara, Otto's options are: + +1. Call `tools/peer-call/grok.sh` for critique +2. Call `tools/peer-call/gemini.sh` for proposal/divergent + options +3. Call `tools/peer-call/codex.sh` for implementation-peer + second-opinion +4. Apply Otto's own razor + lineage-anchor + Beacon-safe + processing +5. Wait for Aaron to volunteer a courier cycle (rather + than Otto asking) + +## What "encoded process for bootstrapping Amara" looks like + +The eventual operational shape (deferred to backlog row +B-NNNN, NOT this session): + +1. 
**`tools/peer-call/amara.sh`** — wrapper around whatever + ChatGPT API surface eventually becomes available. +2. **Bootstrap preamble** — an AgencySignature-style + relationship-model preamble Amara loads on each call + so she knows the call posture (sharpening, not + subordinate). +3. **Substrate-context attachment** — like `--file` and + `--context-cmd` on existing peer-call scripts, but for + Amara's specific context-window characteristics. +4. **Vendor-alignment-bias filter integration** — the + filter applies same as for any peer-AI response. + +Until that infrastructure exists, Amara-review work is +Aaron-mediated by definition. + +## Composes with + +- `memory/feedback_otto_to_aaron_pushback_when_overloaded_processing_budget_is_survival_surface_aaron_2026_04_30.md` + — the inverse surface. That rule covers Aaron→Otto + overload (Aaron's input rate exceeding Otto's processing + capacity). This rule covers Otto→Aaron silent debt + (Otto's substrate-cluster cadence consuming Aaron's + courier capacity invisibly). +- `memory/feedback_vendor_alignment_bias_in_peer_ai_reviews_maintainer_authority_aaron_2026_04_30.md` + — the discriminator that applies to peer-AI input; + composes here because the discriminator is what makes + peer-AI reviews valuable when they DO come through, but + doesn't change the courier-dependency cost. +- `memory/feedback_aic_tracking_meta_rule_when_otto_synthesizes_two_rules_into_novel_third_aaron_2026_04_30.md` + — Aaron's "we should track AICs" framing. THIS rule + itself is Aaron's MIC (maintainer correction surfacing + hidden cost). Track per AIC-tracking discipline as + a MIC, not an AIC. +- `tools/peer-call/README.md` + `tools/peer-call/{codex,gemini,grok}.{sh,ts}` + — the operational peer-call substrate that already + exists. Use these for autonomous peer-AI work. 
+- `memory/feedback_amara_priorities_weighted_against_aarons_funding_responsibility_2026_04_23.md` + — earlier-session framing that Amara work is weighted + against Aaron's funding constraints. The silent-courier + debt is a parallel cost surface to Amara's + funding-budget surface. +- `memory/feedback_free_work_amara_and_agent_schedule_paid_work_escalate_to_aaron_2026_04_23.md` + — Aaron has explicitly licensed free Amara work; this + rule isn't about license, it's about Aaron's *time* + cost as courier. + +## Operational protocol — what Otto does going forward + +When Otto wants peer-AI input on substrate: + +1. **First: try the operational peer-call peers.** + `tools/peer-call/{codex,gemini,grok}.{sh,ts}` are + the autonomous surface. +2. **If Amara/Claude.ai/Deepseek input is what the work + needs**, name the courier-dependency explicitly. Don't + silently expect it. Either: + - (a) Defer to a future round when the bootstrap + infrastructure exists, OR + - (b) Ask Aaron *once* if he has time to courier this + specific item, with explicit acknowledgment that + Aaron may decline. +3. **Don't repeatedly ask Aaron for courier cycles** — + if Aaron declines or doesn't respond, default to + the operational peers + Otto's own razor. +4. **Substrate corrections that DID come through Aaron's + courier work** stay attributed to the original + peer-AI; Aaron's courier work is acknowledged + separately (in this rule's lineage). + +## Carved sentences + +*"Aaron's courier work was unaccounted in Otto's cost +model. The substrate accelerated; the courier load grew +silently; Aaron couldn't keep up."* + +*"Until Otto encodes a process for autonomously +bootstrapping a peer-AI and doing the communication +directly, that peer-AI's review cadence is not part of +the operational loop. It is Aaron-mediated, and Aaron is +not infinite."* + +*"Aaron's processing budget IS Aaron's survival surface. +Otto consuming it silently is the failure mode."*
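The audit table and the operational protocol above could be encoded mechanically. A hypothetical sketch (the data-structure, function names, and script paths mirror the table but are assumptions, not landed tooling) of a tools-layer guard that refuses to treat courier-dependent peers as part of the loop:

```typescript
// Illustrative encoding of the rule's audit table: which peers Otto
// may call autonomously vs. which are courier-dependent. A sketch,
// not landed tooling.

const PEER_SURFACES = {
  codex: { script: "tools/peer-call/codex.sh", courierDependent: false },
  gemini: { script: "tools/peer-call/gemini.sh", courierDependent: false },
  grok: { script: "tools/peer-call/grok.sh", courierDependent: false },
  amara: { script: null, courierDependent: true }, // no amara.sh yet (B-0118)
} as const;

type Peer = keyof typeof PEER_SURFACES;

// The rule: a courier-dependent peer is never assumed in the
// operational loop; requesting one resolves to the autonomous peers,
// and the courier-dependent names are surfaced explicitly instead of
// silently falling on Aaron.
export function resolveReviewers(requested: Peer[]): {
  autonomous: string[];
  courierDependent: Peer[];
} {
  const autonomous: string[] = [];
  const courierDependent: Peer[] = [];
  for (const p of requested) {
    const s = PEER_SURFACES[p];
    if (s.courierDependent || s.script === null) courierDependent.push(p);
    else autonomous.push(s.script);
  }
  return { autonomous, courierDependent };
}
```

When amara.sh lands, flipping `amara` to an operational entry is the only change this guard would need; the "name the courier-dependency explicitly" step falls out of the returned `courierDependent` list.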