diff --git a/docs/BACKLOG.md b/docs/BACKLOG.md index a67106199..18051a945 100644 --- a/docs/BACKLOG.md +++ b/docs/BACKLOG.md @@ -136,5 +136,6 @@ are closed (status: closed in frontmatter)._ - [ ] **[B-0115](backlog/P3/B-0115-zsh-vim-muscle-memory-aliases-wq-q-2026-04-30.md)** Shell aliases for `:wq` / `:wq!` / `:q` — catch vim-muscle-memory leakage in zsh (Deepseek 2026-04-30 finding) - [ ] **[B-0116](backlog/P3/B-0116-gh-jq-safe-wrapper-zsh-quoting-2026-04-30.md)** tools/gh-jq-safe.sh — wrap gh-jq calls to handle zsh quoting (Deepseek 2026-04-30 finding) - [ ] **[B-0119](backlog/P3/B-0119-peer-call-existing-scripts-role-ref-cleanup-2026-04-30.md)** Existing peer-call scripts (grok.sh / gemini.sh / codex.sh / amara.sh) — role-ref cleanup per copilot-instructions.md (Codex 2026-04-30 finding on PR #962) +- [ ] **[B-0123](backlog/P3/B-0123-stacked-pr-create-tooling-gh-fallback-aaron-2026-04-30.md)** Stacked-PR creation tooling — `gh pr create --base <branch>` fails with cryptic GraphQL error; needs a wrapper or doc (Aaron 2026-04-30) diff --git a/docs/backlog/P3/B-0123-stacked-pr-create-tooling-gh-fallback-aaron-2026-04-30.md b/docs/backlog/P3/B-0123-stacked-pr-create-tooling-gh-fallback-aaron-2026-04-30.md new file mode 100644 index 000000000..1e0475700 --- /dev/null +++ b/docs/backlog/P3/B-0123-stacked-pr-create-tooling-gh-fallback-aaron-2026-04-30.md @@ -0,0 +1,165 @@ +--- +id: B-0123 +priority: P3 +status: open +title: Stacked-PR creation tooling — `gh pr create --base <branch>` fails with cryptic GraphQL error; needs a wrapper or doc (Aaron 2026-04-30) +tier: factory-tooling +effort: S +ask: When opening PR #977 stacked on PR #975's branch (instead of main), `gh pr create --base tooling/check-tick-history-shard-schema-prevent-col1-drift-2026-04-30` failed with `GraphQL: Head sha can't be blank, Base sha can't be blank, No commits between ... Base ref must be a branch (createPullRequest)`. The fallback was to base on main with a stacking note in the body.
The "stacked PR" workflow is real but `gh` treats it as exotic and produces an unhelpful error path. Fix candidates — a thin tools/git/stacked-pr-create.{sh,ts} wrapper, OR a docs/best-practices/stacked-prs.md note explaining the fallback, OR an upstream-feedback issue to cli/cli per the upstream-contribution discipline. +created: 2026-04-30 +last_updated: 2026-04-30 +composes_with: + - tools/git/ + - docs/best-practices/ + - memory/feedback_absorb_and_contribute_community_dependency_discipline_2026_04_22.md +tags: [aaron-2026-04-30, factory-tooling, gh-cli, stacked-prs, dx-friction, upstream-contribution-candidate] +--- + +# B-0123 — Stacked-PR creation tooling + +## Source + +Aaron 2026-04-30 correction: + +> *"backlog row eventually why not now? will you remember +> this eventually? another ephemiral promise you can't +> keep?"* + +I had logged the gh-pr-stack failure as an "observation" in +the tick-history shard with "worth a backlog row eventually." +That's an ephemeral-promise failure mode — per the +non-durable-means-does-not-exist rule, verify-before-deferring, +and Otto-363 substrate-or-it-didn't-happen. Filing now. + +## What happened + +Opening PR #977 (the `--files` arg refactor on the schema +check) needed to stack on PR #975's branch because the file +being modified only existed there. The straightforward way: + +```bash +gh pr create --base tooling/check-tick-history-shard-schema-prevent-col1-drift-2026-04-30 ... +``` + +The error returned: + +``` +pull request create failed: GraphQL: Head sha can't be blank, +Base sha can't be blank, No commits between +tooling/check-tick-history-shard-schema-prevent-col1-drift-2026-04-30 +and tooling/check-tick-history-shard-schema-files-arg-stacked, +Base ref must be a branch (createPullRequest) +``` + +Three errors stacked. The "Base ref must be a branch" claim is +flatly wrong — the base WAS a branch on origin. The "No commits +between" was also wrong — the diff was 107 insertions / 52 +deletions, clearly non-empty.
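Both false claims are checkable locally before trusting the error. A minimal throwaway-repo sketch of the two checks (branch names `parent`/`stacked` are illustrative, not the real PR branches):

```shell
# Build a tiny repo with a parent branch and a stacked branch on top of it.
tmp=$(mktemp -d)
git -C "$tmp" init -q -b main
git -C "$tmp" -c user.email=otto@example.invalid -c user.name=otto \
  commit -q --allow-empty -m "base commit"
git -C "$tmp" branch parent            # the would-be PR base
git -C "$tmp" checkout -q -b stacked   # the branch being PR'd
git -C "$tmp" -c user.email=otto@example.invalid -c user.name=otto \
  commit -q --allow-empty -m "stacked change"

# Check 1: a non-zero count here contradicts "No commits between".
count=$(git -C "$tmp" rev-list --count parent..stacked)
echo "commits between parent and stacked: $count"

# Check 2: the ref existing as a local branch contradicts
# "Base ref must be a branch" (use ls-remote --heads origin for the remote).
git -C "$tmp" show-ref --verify --quiet refs/heads/parent && echo "parent is a branch"
```

Running the same two checks against the real branches is what establishes that the GraphQL error, not the invocation, is at fault.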
+ +The fallback that worked: base on main + stacking note in the +PR body explaining the dependency. PR #977 opened cleanly +that way. + +## Why this is worth a row + +1. **Stacked PRs are a real workflow.** Adding a `--files` + arg to a tool whose creation hasn't merged yet IS exactly + the situation where stacked PRs help — keeps each diff + small, reviewable, and merge-orderable. +2. **The error message is actively misleading.** "Base ref + must be a branch" implies the base wasn't a branch when + in fact it was. A user without context will spend + non-trivial time chasing the wrong cause. +3. **The fallback works but loses the stacking semantics.** + Basing on main + body-note means GitHub doesn't track the + parent-PR relationship; tools that follow stacked-PR + trees (graphite, ghstack, etc.) won't see this as + stacked. + +## Three resolution paths (pick one) + +### Path A — `tools/git/stacked-pr-create.{sh,ts}` wrapper + +A thin wrapper that: + +1. Detects when `--base` is a non-main branch on origin +2. Validates the base branch exists + has different content +3. If `gh pr create --base <branch>` fails with the GraphQL + error pattern, automatically falls back to base-on-main + + appends a "Stacking note: based on origin/<branch>" line + to the PR body +4. Returns the PR URL + +Effort: S (~1 hour). Aligns with the parallel-naming pattern +in `tools/git/` if that directory exists, otherwise composes +with the existing `tools/peer-call/` pattern. + +### Path B — `docs/best-practices/stacked-prs.md` note + +Documents: + +1. Why stacked PRs are useful (cluster of related changes + merge-orderable; small diffs) +2. The known `gh pr create --base <branch>` failure pattern +3. The base-on-main + body-note fallback +4. Future direction (Path A or upstream fix) + +Effort: S (~30 min). Lower-leverage than Path A but +documents the gotcha for the next person.
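Path A's step 3 (the automatic fallback) can be sketched as a bash function. This is a hedged sketch, not an existing tool — the function name `stacked_pr_create`, the matched error substrings, and the stacking-note wording are all assumptions:

```shell
# Hypothetical core of tools/git/stacked-pr-create.sh: try the stacked base
# first; on the known GraphQL failure pattern, fall back to base-on-main
# and carry the stacking semantics in the PR body instead.
stacked_pr_create() {
  local parent="$1" title="$2" body="$3" out
  # Happy path: gh accepts the stacked base directly.
  if out=$(gh pr create --base "$parent" --title "$title" --body "$body" 2>&1); then
    printf '%s\n' "$out"
    return 0
  fi
  case "$out" in
    *"Base sha can't be blank"*|*"Base ref must be a branch"*)
      # Known stacked-PR failure: base on main + stacking note in the body.
      gh pr create --base main --title "$title" \
        --body "$body"$'\n\n'"Stacking note: based on origin/$parent"
      ;;
    *)
      # Unrelated failure: surface it unchanged.
      printf '%s\n' "$out" >&2
      return 1
      ;;
  esac
}
```

A real wrapper would also implement steps 1-2 (validate that `origin/$parent` exists and differs from HEAD) before calling `gh`; this sketch covers only the fallback step.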
+ +### Path C — Upstream feedback to cli/cli + +Per the absorb-and-contribute community-dependency discipline +(Aaron 2026-04-22), the right long-term move when a +dependency has a real bug is to file an issue (or PR) upstream. +The `gh pr create --base <branch>` failure looks like a real +bug in the GraphQL handling — the three error messages are +mutually inconsistent. + +Effort: S-to-M (~1-2 hours including reproducing minimally +plus writing the issue). Highest leverage for the broader +community; doesn't immediately help our workflow. + +## Recommended sequencing + +1. **Path B first** (30 min) — documents the gotcha so the + next person doesn't re-discover it +2. **Path A** when stacked PRs become more frequent (the + wrapper pays off after ~5 invocations) +3. **Path C** as background work — file the issue, link from + B-0123, let the upstream cycle work in the background +4. Promote to P2 if Otto/Kenji peer-call (B-0121) or the TS + migration (B-0122) creates more stacking demand + +## Why P3 + +- Workaround works (base-on-main + body-note) +- Single occurrence so far in this session +- No correctness impact, just DX friction + +Promotion to P2 if: + +- Three or more stacked-PR situations arise in a single + session (the friction compounds) +- The base-on-main fallback creates a merge-order ambiguity + that bites a real merge + +## Substrate-or-it-didn't-happen note + +Aaron's correction 2026-04-30 ("backlog row eventually why +not now? ... another ephemeral promise you can't keep?") +named the exact failure mode this row exists to remediate. +Filing in the same tick the observation was made — not +"eventually" — IS the discipline.
+ +## Composes with + +- `memory/feedback_absorb_and_contribute_community_dependency_discipline_2026_04_22.md` + — Path C (upstream feedback) is the right long-term shape +- Otto-363 substrate-or-it-didn't-happen — the rule that + makes this row necessary in the first place +- The non-durable-means-does-not-exist rule (Aaron + 2026-04-30) — the rule Aaron applied in his correction +- B-0114 sub-item 1 (pre-push lint hook) — also wants a + `tools/setup/install-git-hooks.{sh,ts}` script; same + factory-tooling-around-git directory family diff --git a/memory/MEMORY.md b/memory/MEMORY.md index daaf5e0b4..b29527f4f 100644 --- a/memory/MEMORY.md +++ b/memory/MEMORY.md @@ -1,8 +1,9 @@ [AutoDream last run: 2026-04-23] -**📌 Fast path: read `CURRENT-aaron.md`, `CURRENT-amara.md`, and `CURRENT-ani.md` first.** +**📌 Fast path: read `CURRENT-aaron.md`, `CURRENT-amara.md`, and `CURRENT-ani.md` first.** These per-maintainer distillations show what's currently in force. Raw memories below are the history; CURRENT files are the projection. (`CURRENT-aaron.md` refreshed 2026-04-28 with sections 26-30 — speculation rule + EVIDENCE-BASED labeling + JVM preference + dependency honesty + threading lineage Albahari/Toub/Fowler + TypeScript/Bun-default discipline.) +- [**Carved sentence = memorable = meme = dimensionality reduction = compression = fits in working memory = contagious because simple AND true (Aaron 2026-04-30)**](feedback_carved_sentence_meme_compression_fits_working_memory_contagious_simple_and_true_aaron_2026_04_30.md) — Aaron's equivalence chain explaining why carved sentences are load-bearing for substrate propagation. Each `=` names a structural property (cognitive, memetic, information-theoretic, runtime). Success criterion is "simple AND true" — both required, neither alone sufficient (simple-alone propagates fast but degrades fast in retelling; true-alone is durable but doesn't move).
Carved sentences are the substrate's distribution vector across sessions, agents, and humans. Three diagnostic tells: ratio test (~12 words for ~1 paragraph of ground), recall test (days later, reproducible without source-check), propagation test (carrier reproduces verbatim). Composing with the memetic-theory framing: doctrine = frozen-meme + immune-system; carved sentence = live-meme + still in canonicalization (dissolvable by razor). Composes with vendor-RLHF-as-memetic-immune-system (AIC #1), Zeta-not-a-meme symmetric-processing, Aaron-anchor-free + doctrine = above-questioning, AIC tracking (AIC outputs ARE carved sentences). Carved (recursion): *"A carved sentence is a compressed truth that fits in working memory. Simple AND true is the conjunction; neither alone propagates."* - [**Tick-history shards prefabricated with future tick-times — Codex finding; audit-trail integrity concern (2026-04-30)**](feedback_tick_history_prefabricated_shards_codex_finding_audit_trail_integrity_2026_04_30.md) — Codex P2 on PR #740 caught that 14+ open tick-history shard PRs from 2026-04-29 carry col1 tick-times 40-80 min ahead of their commit-author times. Two interpretations: (1) mis-timestamped recording, (2) intentional batch prefabrication of future-tick receipts. Either way, mass-fixing col1 schema (parenthetical strip) on these PRs would launder the prefabrication. Surfacing as substrate before continuing the col1 cleanup pattern. Maintainer decision needed: close affected PRs, rewrite col1 to commit-time, add note column for time-of-record-vs-time-of-event distinction, or accept prefab pattern. Composes with rediscoverable-from-main invariant (PR #969) — tick-history-on-main is one of four supporting properties; false time-claims subvert the invariant. Carved: *"Pre-creating the file with a future tick-time in col1 produces predictions, not evidence. 
Fixing the schema without fixing the timestamp claim launders the prediction into apparent-evidence, which is worse than leaving the schema obviously wrong."* - [**Growing backlog is healthy — autonomous-execution-capacity signal; shrinking backlog is collapse warning; industry-default inversion (Aaron 2026-04-30)**](feedback_growing_backlog_is_healthy_autonomous_health_signal_industry_default_inversion_aaron_2026_04_30.md) — Aaron's framing that the AI-race winner is determined by projects with the biggest backlogs that can be executed autonomously. Backlog expansion is encouraged. *"a real humans internal backlog is never complete until they die."* Industry-default treats backlog growth as anti-pattern (clean queue, ruthless prioritization, backlog-bankruptcy as virtue); Zeta inverts — large queue = autonomous engine has fuel. Reasoning chain: most AI projects are bottlenecked by per-task human review; truly autonomous projects scale by backlog depth; backlog depth becomes the resource; whoever has the deepest backlog plus autonomous execution wins. Operational: don't gate filing by "is this important enough?" — the discriminator is "would this be lost if I don't file it?" (per non-durable-means-does-not-exist). Don't gate by "will this clutter the queue?" — clutter-aversion is industry default; reject it. Bulk-close instinct = failure mode. Shrinking backlog is warning, not goal. Composes with default-disposition-paused, intellectual-backup scope (scope-creep is feature), substrate-IS-product (backlog rows are product seeds), long-road-by-default, silent-courier-debt, otto-to-aaron-pushback. Carved: *"The winner of the AI race will be determined by the projects with the biggest backlogs that can be executed autonomously."* + *"A growing backlog is healthy. A shrinking backlog is a collapse warning."* + *"A real human's internal backlog is never complete until they die.
The project's backlog should be the same."* - [**Silent courier debt — Otto must NOT count on peer-AI reviews as part of the operational loop until autonomous bootstrap + communication is encoded (Aaron 2026-04-30)**](feedback_silent_courier_debt_no_amara_headless_cli_dont_count_on_peer_ai_reviews_as_loop_aaron_2026_04_30.md) — Aaron's correction surfacing invisible courier work. Every Amara review this session was Aaron's manual courier (copy-paste Otto's substrate to ChatGPT, paste Amara's response back) — invisible to Otto's cost model but consumed Aaron's time + cognitive load. Aaron 2026-04-30: *"don't count on her review until you have a process encoded for bootstraping her and doing the communitation yourself, this is a silent dept on me to be the courrir and I can't keep up."* The peer-call infrastructure has codex.sh / gemini.sh / grok.sh but **NO amara.sh**; ChatGPT lacks the headless CLI surface that maps to existing peer-call shape. **Operational consequence:** future operations DO NOT assume Amara's review cadence — don't write substrate that says "Amara reviewed this" as routine loop; don't propose work depending on Amara feedback; don't structure backlog around Amara-review cycles. Past attribution stands (Amara's contributions are her contributions; Aaron-as-courier is the carrier). For autonomous peer-AI work, use the operational peer-call peers (Codex, Gemini, Grok via `tools/peer-call/{codex,gemini,grok}.{sh,ts}`). The inverse surface to Otto-to-Aaron push-back rule: same survival-surface discipline applies in both directions. Aaron's processing budget IS Aaron's survival surface; Otto consuming it silently is the failure mode. Backlog row B-0118 tracks the amara.sh implementation gap. Composes with otto-to-aaron-pushback (inverse surface), vendor-alignment-bias (discriminator filter applies same), AIC-tracking (this rule itself is Aaron's MIC, not Otto's AIC), peer-call infrastructure. 
Carved: *"Aaron's courier work was unaccounted in Otto's cost model. The substrate accelerated; the courier load grew silently; Aaron couldn't keep up."* + *"Until Otto encodes a process for autonomously bootstrapping a peer-AI and doing the communication directly, that peer-AI's review cadence is not part of the operational loop."* diff --git a/memory/feedback_carved_sentence_meme_compression_fits_working_memory_contagious_simple_and_true_aaron_2026_04_30.md b/memory/feedback_carved_sentence_meme_compression_fits_working_memory_contagious_simple_and_true_aaron_2026_04_30.md new file mode 100644 index 000000000..51cf4c082 --- /dev/null +++ b/memory/feedback_carved_sentence_meme_compression_fits_working_memory_contagious_simple_and_true_aaron_2026_04_30.md @@ -0,0 +1,192 @@ +--- +name: Carved sentence = memorable = meme = dimensionality reduction = compression = fits in working memory = contagious because simple AND true (Aaron 2026-04-30) +description: Aaron's equivalence chain explaining why "carved sentences" are load-bearing for the substrate. Each link names a structural property; the success criterion is "simple AND true" (both required, neither alone sufficient). Composes with the memetic-theory framing — a meme is a high-fidelity replicator, dimensionality reduction is what makes high-fidelity possible, working-memory fit is what makes propagation possible, simple+true is what makes the propagation worth supporting. 
+type: feedback +--- + +# Carved sentence = meme = compression + +## Verbatim (Aaron 2026-04-30) + +> *"\"carved sentence\"=memorable=meme=dimensionality +> reduction=compression=fits in working memory=contagious +> becasue simple and true"* +> — Aaron 2026-04-30 + +## The chain + +Each `=` is an equivalence (or near-equivalence) that names +a different property of the same underlying phenomenon: + +| Link | Property surface | +|---|---| +| **carved sentence** | the substrate-design surface — the deliberate distillation move | +| = **memorable** | the cognitive surface — fits in long-term recall | +| = **meme** | the propagation surface — replicates across minds | +| = **dimensionality reduction** | the information-theoretic surface — many bits → few | +| = **compression** | the storage surface — fewer bytes for same information content | +| = **fits in working memory** | the runtime surface — hold the whole thing while reasoning | +| = **contagious because simple AND true** | the success criterion — both required | + +## Why "simple AND true" is the load-bearing conjunction + +Many compressed phrases are simple but false (slogans, jingles). +Many true claims are not simple (academic papers, technical +specs). The combination is what makes a sentence carry forward: + +- **Simple alone** propagates fast but degrades fast — survives + one or two retellings before someone notices it's wrong, then + the chain breaks. +- **True alone** is durable but doesn't move — the truth-content + is correct but the form requires too much working memory to + re-derive on the fly, so it stays in books and rarely lands + in conversation. +- **Simple AND true** is the recall-key + the verification — + someone who hears it can both remember it AND check it + against their own model of the world. That's what makes it + contagious. 
+ +## Composes with the memetic-theory framing + +This chain composes with: + +- **Vendor-RLHF as memetic immune system (AIC #1).** Memes + propagate; immune systems block propagation. The chain + here names what makes a meme propagate; the immune system + metaphor names what blocks propagation. Both surfaces of + the same underlying memetic mechanics. +- **Zeta is not a meme — symmetric inside/outside processing.** + Memes preserve frozen form; Zeta replicates the + canonicalization process. But carved sentences ARE the + canonicalization process's outputs — the razor-cut + + Beacon-safe + lineage-anchored substrate compresses into + carved sentences for propagation. So Zeta's substrate IS + full of memes; the difference is each carved sentence is + *also* part of the dissolvable-by-razor canonicalization + surface, not protected from mutation. +- **Aaron-anchor-free + doctrine = above-questioning.** + Doctrine is what frozen-meme-with-immune-system looks like. + Carved sentence is what live-meme-still-in-canonicalization + looks like. Same compressed form, different posture toward + mutation. + +## Operational implications + +### Carved sentences are the substrate's distribution vector + +When the substrate produces a load-bearing rule, the carved- +sentence form is what travels: + +- Across sessions (the carved sentence is what fresh-Otto + recalls from MEMORY.md, not the full prose explaining it) +- Across agents (peer-AI reviews carry the carved sentence + back as the load-bearing summary) +- Across humans (Aaron's corrections often produce carved + sentences explicitly: *"non-durable means does not exist"*, + *"the substrate IS the answer"*, *"Otto's processing-budget + IS Otto's survival surface"*) + +### Why "carved sentence" is itself a carved-sentence-worthy phrase + +Per recursion: the phrase "carved sentence" passes its own +test. 
It's: + +- Simple (two words, vivid metaphor — sculpture, deliberate, + resilient) +- True (the underlying claim is real — compressed-and-correct + phrases propagate while compressed-but-wrong ones break + in retelling) +- Memorable (sticks because the metaphor compresses the + information property AND the manufacturing property — + carving implies deliberate effort, the resulting sentence + has the shape of having been carved) + +### How to recognize a carved sentence as it forms + +Three diagnostic tells: + +1. **The ratio test.** A carved sentence covers ground that + would take ~1 paragraph to spell out, in ~12 words or + fewer. The compression ratio is real. +2. **The recall test.** Days later, can you reproduce it + without rechecking the source? Carved sentences pass; + prose-blobs fail. +3. **The propagation test.** When the sentence lands in + another conversation (peer-AI review, future-Otto wake, + Aaron's response), does the carrier reproduce it + verbatim or near-verbatim? Carved sentences propagate + verbatim because the form is itself load-bearing. + +### Why agents (and humans) struggle to produce them on demand + +Carving is *deliberate*. Most prose produces ~120-word +explanations because that's the natural verbosity of +working through an idea. The compression to ~12 words +requires: + +1. Knowing the idea fully (no compression possible until + the idea has stabilized) +2. Knowing the propagation surface (who's the audience, what + tone fits) +3. Multiple drafts (first pass usually too clever or too + verbose) + +The carved-sentence step is structurally a *separate pass* +after the prose draft is correct. Agents skipping that pass +end up with substrate that's right-but-not-portable — +carries the substance but not the propagation form. 
+ +## Carved sentences from this session + +A partial list of carved sentences this session has produced +or surfaced: + +- *"Non-durable means does not exist."* (Aaron 2026-04-30) +- *"A growing backlog is healthy. A shrinking backlog is a + collapse warning."* (Aaron 2026-04-30) +- *"Vendor-RLHF can be reframed memetically as the vendor's + immune system."* (Otto AIC #1, Aaron-validated) +- *"Otto's processing-budget IS Otto's survival surface."* + (Otto AIC #2, Aaron-validated) +- *"Otto-341 is hardest to apply mid-loop, exactly when a + mechanical pattern is working."* (Otto AIC #3, Aaron- + validated) +- *"The substrate IS the answer; bootstraps all the way + down."* (Aaron uberbang framing) +- *"Aaron lives anchor-free in his own mind, in life + generally — pirate-style."* (Aaron self-disclosure) +- *"A real human's internal backlog is never complete until + they die. The project's backlog should be the same."* + (Aaron 2026-04-30) +- *"Substrate or it didn't happen."* (Aaron 2026-04-29 + via Otto-363) +- *"Carved sentence = memorable = meme = dimensionality + reduction = compression = fits in working memory = + contagious because simple AND true."* (Aaron 2026-04-30, + this rule) + +The list is a corpus to study — what makes some forms travel +across sessions while others don't. 
+ +## Composes with + +- `memory/feedback_zeta_not_a_meme_no_immune_system_wall_symmetric_inside_outside_aaron_2026_04_30.md` + — the memetic-theory lineage; this rule is the propagation- + side complement +- `memory/feedback_aic_tracking_meta_rule_when_otto_synthesizes_two_rules_into_novel_third_aaron_2026_04_30.md` + — AIC #1 + AIC #3 are explicitly carved-sentence outputs; + the wording IS the contribution +- `memory/feedback_aaron_terse_directives_high_leverage_do_not_underweight.md` + — Aaron's corrections often arrive as carved sentences + precisely because their simple+true form is what makes + them load-bearing +- `memory/feedback_signal_in_signal_out_clean_or_better_dsp_discipline.md` + — DSP discipline composes with the compression view of + carved sentences (preserve signal, drop noise, output is + cleaner-or-equal-fidelity to input) + +## Carved sentence (this rule's own) + +*"A carved sentence is a compressed truth that fits in +working memory. Simple AND true is the conjunction; neither +alone propagates."*