Merged

Changes from all commits (29 commits)
- `095ffb2` chore: reduce fee log severity (spypsy, Feb 10, 2026)
- `9cc3e8b` fix: Fix p2p integration test (PhilWindle, Feb 10, 2026)
- `dfdd2e3` chore(ci3): add optional local cache for bootstrap artifacts (#20305) (spalladino, Feb 10, 2026)
- `2d4c0d7` fix: Fix p2p integration test (#20331) (PhilWindle, Feb 10, 2026)
- `4af01bc` chore: reduce fee log severity (#20336) (PhilWindle, Feb 10, 2026)
- `d5c5070` retry web3signer connection (mrzeszutko, Feb 10, 2026)
- `f0cf065` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `3169bdc` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `b567eb8` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `8aff4ac` ReqResp message size limits (mrzeszutko, Feb 9, 2026)
- `04038db` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `6e50e62` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `33b8b53` feat: restrict response sizes to expected sizes (#20287) (mrzeszutko, Feb 10, 2026)
- `e5f63ca` feat: retry web3signer connection (#20342) (PhilWindle, Feb 10, 2026)
- `2737717` feat(p2p): Integrate TxPoolV2 across codebase (#20172) (spalladino, Feb 10, 2026)
- `7a9939d` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `5ef3890` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `60e44e9` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `ea235ea` feat: review and optimize Claude configuration, agents, and skills (#… (ludamad, Feb 10, 2026)
- `0f546cf` fix(prover): handle cross-chain messages when proving mbps (#20354) (spalladino, Feb 10, 2026)
- `548095b` chore: retry flakes. if retry pass, is a flake as we know it now. fai… (charlielye, Feb 10, 2026)
- `d43ab92` chore(p2p): add mock reqresp layer for tests (#20370) (spalladino, Feb 10, 2026)
- `6f71ad9` fix: don't propagate on tx add failure (Feb 5, 2026)
- `66eda51` fix: (A-370) don't propagate on tx mempool add failure (#20374) (PhilWindle, Feb 10, 2026)
- `5e33129` Skip the HA test (PhilWindle, Feb 10, 2026)
- `e3d34a9` Merge branch 'next' into merge-train/spartan (Feb 10, 2026)
- `c22ce65` chore: Skip the HA test (#20376) (PhilWindle, Feb 10, 2026)
- `b05a66e` feat: Retain pruned transactions until pruned block is finalised (PhilWindle, Feb 10, 2026)
- `f75d98b` feat: Retain pruned transactions until pruned block is finalised (#20… (PhilWindle, Feb 10, 2026)
1 change: 0 additions & 1 deletion .claude/agents/analyze-logs.md
@@ -2,7 +2,6 @@
name: analyze-logs
description: |
Deep-read test logs and extract relevant information. Runs in separate context to avoid polluting the main conversation. Accepts local file paths (preferred) or hashes. Returns condensed summaries, not raw logs.
model: sonnet
---

# CI Log Analysis Agent
5 changes: 4 additions & 1 deletion .claude/agents/identify-ci-failures.md
@@ -2,7 +2,6 @@
name: identify-ci-failures
description: |
Identify CI failures from a PR number, CI URL, or log hash. Returns structured list of failures with local file paths for downloaded logs. Use this subagent to find what failed before deeper analysis.
model: sonnet
---

# CI Failure Identification Agent
@@ -46,6 +45,10 @@ Return a structured report:
[If found in logs, provide the History URL for finding successful runs]
```

Do NOT:
- Return raw multi-thousand-line log dumps
- Attempt to fix any failures (just identify them)

## Workflow

### Step 1: Get CI Log Hash
35 changes: 6 additions & 29 deletions .claude/skills/ci-logs/SKILL.md
@@ -1,49 +1,26 @@
---
name: ci-logs
description: Analyze CI logs from ci.aztec-labs.com. Use this instead of WebFetch for CI URLs.
user-invocable: true
arguments: <url-or-hash>
argument-hint: <url-or-hash>
---

# CI Log Analysis

When you need to analyze logs from ci.aztec-labs.com, use the Task tool to spawn the analyze-logs agent.
When you need to analyze logs from ci.aztec-labs.com, delegate to the `analyze-logs` subagent.

## Usage

1. **Extract the hash** from the URL (e.g., `http://ci.aztec-labs.com/e93bcfdc738dc2e0` → `e93bcfdc738dc2e0`)

2. **Spawn the analyze-logs agent** using the Task tool:

```
Task(
subagent_type: "analyze-logs",
prompt: "Analyze CI log hash: <hash>. Focus: errors",
description: "Analyze CI logs"
)
```
2. **Spawn the `analyze-logs` subagent** using the Task tool with the hash and focus area (e.g. "errors", "test \<name>", or a custom question) in the prompt.

## Examples

**User asks:** "What failed in http://ci.aztec-labs.com/343c52b17688d2cd"

**You do:**
```
Task(
subagent_type: "analyze-logs",
prompt: "Analyze CI log hash: 343c52b17688d2cd. Focus: errors. Download with: yarn ci dlog 343c52b17688d2cd > /tmp/343c52b17688d2cd.log",
description: "Analyze CI failure"
)
```

**For specific test analysis:**
```
Task(
subagent_type: "analyze-logs",
prompt: "Analyze CI log hash: 343c52b17688d2cd. Focus: test 'my test name'",
description: "Analyze test failure"
)
```
**You do:** Use the Task tool with `subagent_type: "analyze-logs"` and prompt including the hash `343c52b17688d2cd`, focus on errors, and instruction to download with `yarn ci dlog`.

**For specific test analysis:** Same approach, but set the focus to the test name.

## Do NOT

26 changes: 7 additions & 19 deletions .claude/skills/noir-sync-update/SKILL.md
@@ -5,33 +5,21 @@ description: Perform necessary follow-on updates as a result of updating the noi

# Noir Sync Update

## Workflow
## Steps

Copy this checklist and track progress:

```
Noir Sync Update Progress:
- [ ] Step 1: Ensure that the new submodule commit has been pulled.
- [ ] Step 2: Update the `Cargo.lock` file in `avm-transpiler`.
- [ ] Step 3: Update the `yarn.lock` file in `yarn-project`.
- [ ] Step 4: Format `noir-projects`.
```

After each step, commit the results.
After each step, verify with `git status` and commit the results before proceeding.

## Critical Verification Rules

**ALWAYS verify file changes with `git status` after any modification step before marking it complete.** Command output showing "updating" does not guarantee the file was written to disk.

**IMPORTANT:** Always run `git status` from the repository root directory, not from subdirectories. Running `git status noir-projects/` from inside `noir-projects/` will fail silently.
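The verification rule above can be scripted. A minimal sketch (throwaway demo repository; `Cargo.lock` here is just an illustrative file name, not a claim about the real repo layout):

```shell
#!/usr/bin/env bash
# Sketch: fail fast when a file a step claimed to update is in fact unchanged.
set -euo pipefail

verify_changed() {
  # 'git status --porcelain -- <path>' prints nothing for an unmodified file
  [[ -n "$(git status --porcelain -- "$1")" ]] || {
    echo "ERROR: $1 unchanged; do not mark the step complete" >&2
    return 1
  }
}

# Throwaway demo repository
repo="$(mktemp -d)" && cd "$repo" && git init -q
echo v1 > Cargo.lock
git add Cargo.lock && git -c user.email=ci@example.com -c user.name=ci commit -qm init

unmodified_fails=false
verify_changed Cargo.lock || unmodified_fails=true  # clean file: check fails as intended

echo v2 > Cargo.lock                                # simulate a real update
verify_changed Cargo.lock                           # now passes
```

Running `verify_changed` from the repository root sidesteps the silent-failure mode noted above.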

### Step 1: Ensure that the new submodule commit has been pulled

Run `./bootstrap.sh` in `noir` to ensure that the new submodule commit has been pulled.
### 1. Ensure submodule is pulled

This shouldn't update any files such that a commit is necessary.
Run `./bootstrap.sh` in `noir` to ensure that the new submodule commit has been pulled. This shouldn't produce changes that need committing.

### Step 2: Update `Cargo.lock` in `avm-transpiler`
### 2. Update `Cargo.lock` in `avm-transpiler`

**Before updating**, determine the expected noir version:
1. Read `noir/noir-repo/.release-please-manifest.json` to find the expected version (e.g., `1.0.0-beta.18`)
@@ -55,13 +43,13 @@ It's possible that changes in dependencies result in `avm-transpiler` no longer
- If transient dependency mismatches mean changes to the dependency tree are necessary, then the `Cargo.lock` file in `avm-transpiler` should be modified. **DO NOT MODIFY `noir/noir-repo`**.
- If updates are necessary due to changes in exports from `noir/noir-repo` packages, then perform the necessary updates to import statements, etc.

### Step 3: Update `yarn.lock` in `yarn-project`
### 3. Update `yarn.lock` in `yarn-project`

Run `yarn install` in `yarn-project` to update the `yarn.lock` file.

**After running**, verify with `git status yarn-project/yarn.lock` that the file was modified before committing.

### Step 4: Format `noir-projects`
### 4. Format `noir-projects`

Run `./bootstrap.sh format` in `noir-projects`.
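The four steps can be sketched as a dry-run script. Directory names come from the steps above; the exact cargo invocation for step 2 is an assumption, not the mandated command:

```shell
#!/usr/bin/env bash
# Dry-run sketch of the noir sync workflow; prints each step instead of running it.
set -euo pipefail

steps=(
  "(cd noir && ./bootstrap.sh)"                  # 1: pull the new submodule commit
  "(cd avm-transpiler && cargo update)"          # 2: refresh Cargo.lock (assumed invocation)
  "(cd yarn-project && yarn install)"            # 3: refresh yarn.lock
  "(cd noir-projects && ./bootstrap.sh format)"  # 4: format noir-projects
)

for step in "${steps[@]}"; do
  echo "next: $step"
  # eval "$step"  # execute for real, then verify with 'git status' from the repo root and commit
done
```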

36 changes: 10 additions & 26 deletions .claude/skills/updating-changelog/SKILL.md
@@ -5,19 +5,9 @@ description: Updates changelog documentation for contract developers and node op

# Updating Changelog

## Workflow
## Steps

Copy this checklist and track progress:

```
Changelog Update Progress:
- [ ] Step 1: Determine target changelog file from .release-please-manifest.json
- [ ] Step 2: Analyze branch changes (git diff next...HEAD)
- [ ] Step 3: Generate draft entries for review
- [ ] Step 4: Edit documentation files after approval
```

### Step 1: Determine Target Files
### 1. Determine Target Files

Read `.release-please-manifest.json` to get the version (e.g., `{"." : "4.0.0"}` → edit `v4.md`).

@@ -26,7 +16,7 @@ Read `.release-please-manifest.json` to get the version (e.g., `{"." : "4.0.0"}`
- Aztec contract developers: `docs/docs-developers/docs/resources/migration_notes.md`
- Node operators and Ethereum contract developers: `docs/docs-network/reference/changelog/v{major}.md`

### Step 2: Analyze Branch Changes
### 2. Analyze Branch Changes

Run `git diff next...HEAD --stat` for overview, then `git diff next...HEAD` for details.

@@ -37,11 +27,11 @@ Run `git diff next...HEAD --stat` for overview, then `git diff next...HEAD` for
- Deprecations
- Configuration changes (CLI flags, environment variables)

### Step 3: Generate Draft Entries
### 3. Generate Draft Entries

Present draft entries for review BEFORE editing files. Match the formatting conventions by reading existing entries in each file.

### Step 4: Edit Documentation
### 4. Edit Documentation

After approval, add entries to the appropriate files.

@@ -66,44 +56,38 @@

**Impact**: Effect on existing code.

````

**Component tags:** `[Aztec.nr]`, `[Aztec.js]`, `[PXE]`, `[Aztec Node]`, `[AVM]`, `[L1 Contracts]`, `[CLI]`

## Node Operator Changelog Format

**File:** `docs/docs-network/reference/changelog/v{major}.md`

**Breaking changes:**
```markdown
````markdown
### Feature Name

**v{previous}:**
```bash
--old-flag <value> ($OLD_ENV_VAR)
````
```

**v{current}:**

```bash
--new-flag <value> ($NEW_ENV_VAR)
```

**Migration**: How to migrate.

````

**New features:**
```markdown
````markdown
### Feature Name

```bash
--new-flag <value> ($ENV_VAR)
````
```

Description of the feature.

```
````

**Changed defaults:** Use table format with Flag, Environment Variable, Previous, New columns.
```
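For "Changed defaults", an entry might look like the following (the flag, environment variable, and values are hypothetical, shown only to illustrate the column layout):

```markdown
### Changed Defaults

| Flag | Environment Variable | Previous | New |
| --- | --- | --- | --- |
| `--tx-pool-size` | `TX_POOL_SIZE` | `1000` | `2000` |
```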
60 changes: 46 additions & 14 deletions ci3/cache_download
@@ -36,23 +36,55 @@ else
endpoint="https://build-cache.aztec-labs.com"
fi

if [[ -n "${S3_BUILD_CACHE_AWS_PARAMS:-}" ]]; then
# Use AWS CLI with custom params (e.g., custom endpoint)
# NOTE: This is NOT currently used, but allows for using minio or other S3-compatible storage for tests.
s3_uri="s3://aztec-ci-artifacts/build-cache/$tar_file"
aws $S3_BUILD_CACHE_AWS_PARAMS s3 cp "$s3_uri" "-" | extract_tar
elif [[ -n "${CACHE_SSH_HOST:-}" ]]; then
# Run S3 download on remote host via SSH jump and pipe back
if ! ssh "$CACHE_SSH_HOST" "curl -s -f \"$endpoint/$tar_file\"" | extract_tar; then
echo_stderr "SSH cache download of $tar_file via $CACHE_SSH_HOST failed."
exit 1
# Downloads the artifact from remote to stdout.
function download_from_remote {
if [[ -n "${S3_BUILD_CACHE_AWS_PARAMS:-}" ]]; then
# Use AWS CLI with custom params (e.g., custom endpoint)
# NOTE: This is NOT currently used, but allows for using minio or other S3-compatible storage for tests.
s3_uri="s3://aztec-ci-artifacts/build-cache/$tar_file"
aws $S3_BUILD_CACHE_AWS_PARAMS s3 cp "$s3_uri" "-" 2>/dev/null
elif [[ -n "${CACHE_SSH_HOST:-}" ]]; then
# Run remote download on remote host via SSH jump and pipe back
ssh "$CACHE_SSH_HOST" "curl -s -f \"$endpoint/$tar_file\""
else
# Default to download from remote via curl
curl -s -f "$endpoint/$tar_file"
fi
else
# Default to AWS S3 URL via curl
# Attempt to download and extract the cache file
if ! curl -s -f "$endpoint/$tar_file" | extract_tar; then
}

# Local cache: if CACHE_LOCAL_DIR is set, check local cache first,
# and on miss, download from remote into local cache before extracting.
# If the cache directory cannot be created, skip local caching and fall through.
if [[ -n "${CACHE_LOCAL_DIR:-}" ]] && ! mkdir -p "$CACHE_LOCAL_DIR" 2>/dev/null; then
echo_stderr "Warning: Cannot create local cache dir $CACHE_LOCAL_DIR, skipping local cache."
CACHE_LOCAL_DIR=""
fi

if [[ -n "${CACHE_LOCAL_DIR:-}" ]]; then
local_cache_file="$CACHE_LOCAL_DIR/$tar_file"

if [[ -f "$local_cache_file" ]]; then
echo_stderr "Local cache hit for $tar_file."
extract_tar < "$local_cache_file"
echo_stderr "Cache extraction of $tar_file from local cache complete in ${SECONDS}s."
exit 0
fi

echo_stderr "Local cache miss for $tar_file, downloading from remote."
if ! download_from_remote > "$local_cache_file"; then
rm -f "$local_cache_file"
echo_stderr "Cache download of $tar_file failed."
exit 1
fi

extract_tar < "$local_cache_file"
echo_stderr "Cache download and extraction of $tar_file complete in ${SECONDS}s."
exit 0
fi

if ! download_from_remote | extract_tar; then
echo_stderr "Cache download of $tar_file failed."
exit 1
fi

echo_stderr "Cache download and extraction of $tar_file complete in ${SECONDS}s."
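The local-cache-first flow added above can be exercised in isolation. A sketch with the remote download stubbed out (the stub, `CACHE_LOCAL_DIR` value, and artifact name are illustrative, not the script's real remote paths):

```shell
#!/usr/bin/env bash
# Sketch of cache_download's local-cache-first logic with a stubbed remote.
set -euo pipefail

download_from_remote() { printf 'remote-bytes'; }  # stand-in for the curl/ssh/aws paths

cache_fetch() {
  local tar_file="$1"
  local local_cache_file="${CACHE_LOCAL_DIR:-}/$tar_file"
  if [[ -n "${CACHE_LOCAL_DIR:-}" && -f "$local_cache_file" ]]; then
    cat "$local_cache_file"                        # local hit: serve from disk
  elif [[ -n "${CACHE_LOCAL_DIR:-}" ]]; then
    # miss: populate the local cache, removing partial files on failure
    download_from_remote > "$local_cache_file" || { rm -f "$local_cache_file"; return 1; }
    cat "$local_cache_file"
  else
    download_from_remote                           # no local cache configured
  fi
}

CACHE_LOCAL_DIR="$(mktemp -d)"
cache_fetch demo.tar.gz > /dev/null               # first call populates the cache
first="$(cat "$CACHE_LOCAL_DIR/demo.tar.gz")"
second="$(cache_fetch demo.tar.gz)"               # second call is a local hit
```

Deleting the partial file on a failed download matters: a truncated artifact left in `CACHE_LOCAL_DIR` would otherwise be treated as a hit on the next run.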