4 changes: 3 additions & 1 deletion barretenberg/cpp/CMakePresets.json
@@ -462,7 +462,9 @@
"binaryDir": "${sourceDir}/build-${presetName}",
"environment": {
"CC": "zig cc",
"CXX": "zig c++"
"CXX": "zig c++",
"CFLAGS": "-g0",
"CXXFLAGS": "-g0"
},
"cacheVariables": {
"ENABLE_PIC": "ON",
2 changes: 1 addition & 1 deletion barretenberg/cpp/cmake/lmdb.cmake
@@ -15,7 +15,7 @@ ExternalProject_Add(
SOURCE_DIR ${LMDB_PREFIX}/src/lmdb_repo
BUILD_IN_SOURCE YES
CONFIGURE_COMMAND "" # No configure step
BUILD_COMMAND ${CMAKE_COMMAND} -E env CC=${CMAKE_C_COMPILER}${CMAKE_C_COMPILER_ARG1} AR=${CMAKE_AR} make -e -C libraries/liblmdb XCFLAGS=-fPIC liblmdb.a
BUILD_COMMAND ${CMAKE_COMMAND} -E env --unset=CFLAGS --unset=CXXFLAGS CC=${CMAKE_C_COMPILER}${CMAKE_C_COMPILER_ARG1} AR=${CMAKE_AR} make -e -C libraries/liblmdb XCFLAGS=-fPIC liblmdb.a
INSTALL_COMMAND ""
UPDATE_COMMAND "" # No update step
BUILD_BYPRODUCTS ${LMDB_LIB}
53 changes: 53 additions & 0 deletions docs/docs-operate/operators/reference/changelog/v4.md
@@ -70,6 +70,55 @@ The `getL2Tips()` RPC endpoint now returns a restructured response with addition
- Replace `tips.latest` with `tips.proposed`
- For `checkpointed`, `proven`, and `finalized` tips, access block info via `.block` (e.g., `tips.proven.block.number`)
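The migration bullets above can be sketched as follows. This is a hedged sketch: the interfaces below model only the fields the migration notes mention, not the actual API types, and the v3.x shape is inferred from the notes.

```typescript
// Hypothetical minimal typing of the v4 response: only the fields
// named in the migration notes are modeled here.
interface TipBlockInfo {
  block: { number: number };
}
interface L2Tips {
  proposed: unknown; // replaces the former `tips.latest`
  checkpointed: TipBlockInfo;
  proven: TipBlockInfo;
  finalized: TipBlockInfo;
}

// v3.x read (inferred):  tips.proven.number
// v4 equivalent read:    tips.proven.block.number
function provenBlockNumber(tips: L2Tips): number {
  return tips.proven.block.number;
}
```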

### Block gas limits reworked

The byte-based block size limit has been removed and replaced with field-based blob limits and automatic gas budget computation from L1 rollup limits.

**Removed:**

```bash
--maxBlockSizeInBytes <value> ($SEQ_MAX_BLOCK_SIZE_IN_BYTES)
```

**Changed to optional (now auto-computed from L1 if not set):**

```bash
--maxL2BlockGas <value> ($SEQ_MAX_L2_BLOCK_GAS)
--maxDABlockGas <value> ($SEQ_MAX_DA_BLOCK_GAS)
```

**New:**

```bash
--gasPerBlockAllocationMultiplier <value> ($SEQ_GAS_PER_BLOCK_ALLOCATION_MULTIPLIER)
```

**Migration**: Remove `SEQ_MAX_BLOCK_SIZE_IN_BYTES` from your configuration. Per-block L2 and DA gas budgets are now derived automatically as `(checkpointLimit / maxBlocks) * multiplier`, where the multiplier defaults to 2. You can still override `SEQ_MAX_L2_BLOCK_GAS` and `SEQ_MAX_DA_BLOCK_GAS` explicitly, but they will be capped at the checkpoint-level limits.
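The budget derivation above can be sketched like this. Only the formula and the default multiplier of 2 come from the changelog; the concrete limit and block-count values, and the rounding behavior, are assumptions of this sketch.

```typescript
// Per-block gas budget derived from checkpoint-level L1 rollup limits:
// (checkpointLimit / maxBlocks) * multiplier, with multiplier defaulting to 2.
// Whether the division rounds down is an assumption.
function perBlockGasBudget(checkpointLimit: number, maxBlocks: number, multiplier = 2): number {
  return Math.floor(checkpointLimit / maxBlocks) * multiplier;
}

// e.g. a hypothetical 100M-gas checkpoint limit spread over at most 4 blocks:
perBlockGasBudget(100_000_000, 4); // 50_000_000
```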

### Setup phase allow list requires function selectors

The transaction setup phase allow list now enforces function selectors, restricting which specific functions can run during setup on whitelisted contracts. Previously, any public function on a whitelisted contract or class was permitted.

The semantics of the environment variable `TX_PUBLIC_SETUP_ALLOWLIST` have changed:

**v3.x:**

```bash
--txPublicSetupAllowList <value> ($TX_PUBLIC_SETUP_ALLOWLIST)
```

The variable fully **replaced** the hardcoded defaults, and entries carried no selectors: `I:address`, `C:classId`.

**v4.0.0:**

```bash
--txPublicSetupAllowListExtend <value> ($TX_PUBLIC_SETUP_ALLOWLIST)
```

The variable now **extends** the hardcoded defaults (which are always present). Selectors are now mandatory. Format: `I:address:selector,C:classId:selector`.

**Migration**: If you were using `TX_PUBLIC_SETUP_ALLOWLIST`, ensure all entries include function selectors. Note the variable now adds to defaults rather than replacing them. If you were not setting this variable, no action is needed — the hardcoded defaults now include the correct selectors automatically.
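A sketch of the entry-format change described above. The addresses and selectors below are made-up placeholders, not real defaults, and the three-part check is an illustration of the documented format rather than the node's actual parser.

```typescript
// v3.x entry: no selector, any public function on the instance was permitted.
const v3Entry = 'I:0x1234abcd';
// v4.0.0 entry: a function selector is mandatory.
const v4Entry = 'I:0x1234abcd:0x9e7f4d2a';

// Both documented v4 forms (`I:address:selector`, `C:classId:selector`)
// have exactly three colon-separated segments.
function hasSelector(entry: string): boolean {
  return entry.split(':').length === 3;
}
```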

## Removed features

## New features
@@ -149,6 +198,10 @@ P2P_RPC_PRICE_BUMP_PERCENTAGE=10 # default: 10 (percent)

Set to `0` to disable the percentage-based bump (still requires strictly higher fee).
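A sketch of the replacement rule, assuming the bump behaves like conventional replace-by-fee pricing; the exact rounding and comparison semantics are assumptions, only the 10% default and the strictly-higher-fee requirement come from the text above.

```typescript
// Minimum fee a replacement tx must offer under a percentage price bump.
function minReplacementFee(currentFee: bigint, bumpPercentage: number): bigint {
  if (bumpPercentage === 0) {
    // Bump disabled: a replacement still needs a strictly higher fee.
    return currentFee + 1n;
  }
  return (currentFee * BigInt(100 + bumpPercentage)) / 100n;
}

minReplacementFee(1_000n, 10); // 1100n with the default 10% bump
```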

### Setup allow list extendable via network config

The setup phase allow list can now be extended via the network configuration JSON (`txPublicSetupAllowListExtend` field). This lets network operators distribute additional allowed setup functions to all nodes without code changes. The local environment variable takes precedence over the network JSON value.
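A hypothetical sketch of the precedence described above. The shape of the network-config field and the placeholder entry are assumptions based on the env-var format documented earlier, not the actual schema.

```typescript
// Network config JSON may carry extra allow-list entries for all nodes.
const networkConfig: { txPublicSetupAllowListExtend?: string } = {
  txPublicSetupAllowListExtend: 'I:0x1234abcd:0x9e7f4d2a', // placeholder entry
};

// A locally set env var value wins over the network JSON value.
function effectiveAllowListExtension(envValue: string | undefined): string | undefined {
  return envValue ?? networkConfig.txPublicSetupAllowListExtend;
}
```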

## Changed defaults

## Troubleshooting
@@ -1064,7 +1064,7 @@ pub global GAS_ESTIMATION_DA_GAS_LIMIT: u32 =
GAS_ESTIMATION_TEARDOWN_DA_GAS_LIMIT + MAX_PROCESSABLE_DA_GAS_PER_CHECKPOINT;

// Default gas limits. Users should use gas estimation, or they will overpay gas fees.
// TODO: consider moving to typescript
// TODO: These are overridden in typescript-land. Remove them from here.
pub global DEFAULT_TEARDOWN_L2_GAS_LIMIT: u32 = 1_000_000; // Arbitrary default number.
pub global DEFAULT_L2_GAS_LIMIT: u32 = MAX_PROCESSABLE_L2_GAS; // Arbitrary default number.
pub global DEFAULT_TEARDOWN_DA_GAS_LIMIT: u32 = MAX_PROCESSABLE_DA_GAS_PER_CHECKPOINT / 2; // Arbitrary default number.
10 changes: 8 additions & 2 deletions yarn-project/archiver/src/archiver.ts
@@ -120,7 +120,11 @@ export class Archiver extends ArchiverDataSourceBase implements L2BlockSink, Tra
},
private readonly blobClient: BlobClientInterface,
instrumentation: ArchiverInstrumentation,
protected override readonly l1Constants: L1RollupConstants & { l1StartBlockHash: Buffer32; genesisArchiveRoot: Fr },
protected override readonly l1Constants: L1RollupConstants & {
l1StartBlockHash: Buffer32;
genesisArchiveRoot: Fr;
rollupManaLimit?: number;
},
synchronizer: ArchiverL1Synchronizer,
events: ArchiverEmitter,
l2TipsCache?: L2TipsCache,
@@ -133,7 +137,9 @@ export class Archiver extends ArchiverDataSourceBase implements L2BlockSink, Tra
this.synchronizer = synchronizer;
this.events = events;
this.l2TipsCache = l2TipsCache ?? new L2TipsCache(this.dataStore.blockStore);
this.updater = new ArchiverDataStoreUpdater(this.dataStore, this.l2TipsCache);
this.updater = new ArchiverDataStoreUpdater(this.dataStore, this.l2TipsCache, {
rollupManaLimit: l1Constants.rollupManaLimit,
});

// Running promise starts with a small interval in between runs, so all iterations needed for the initial sync
// are done as fast as possible. This then gets updated once the initial sync completes.
3 changes: 3 additions & 0 deletions yarn-project/archiver/src/factory.ts
@@ -85,13 +85,15 @@ export async function createArchiver(
genesisArchiveRoot,
slashingProposerAddress,
targetCommitteeSize,
rollupManaLimit,
] = await Promise.all([
rollup.getL1StartBlock(),
rollup.getL1GenesisTime(),
rollup.getProofSubmissionEpochs(),
rollup.getGenesisArchiveTreeRoot(),
rollup.getSlashingProposerAddress(),
rollup.getTargetCommitteeSize(),
rollup.getManaLimit(),
] as const);

const l1StartBlockHash = await publicClient
@@ -110,6 +112,7 @@
proofSubmissionEpochs: Number(proofSubmissionEpochs),
targetCommitteeSize,
genesisArchiveRoot: Fr.fromString(genesisArchiveRoot.toString()),
rollupManaLimit: Number(rollupManaLimit),
};

const archiverConfig = merge(
25 changes: 5 additions & 20 deletions yarn-project/archiver/src/modules/data_store_updater.test.ts
@@ -5,17 +5,15 @@ import { ContractClassPublishedEvent } from '@aztec/protocol-contracts/class-reg
import { ContractInstancePublishedEvent } from '@aztec/protocol-contracts/instance-registry';
import { AztecAddress } from '@aztec/stdlib/aztec-address';
import { L2Block } from '@aztec/stdlib/block';
import { Checkpoint } from '@aztec/stdlib/checkpoint';
import { ContractClassLog, PrivateLog } from '@aztec/stdlib/logs';
import { CheckpointHeader } from '@aztec/stdlib/rollup';
import '@aztec/stdlib/testing/jest';

import { readFileSync } from 'fs';
import { dirname, resolve } from 'path';
import { fileURLToPath } from 'url';

import { KVArchiverDataStore } from '../store/kv_archiver_store.js';
import { makePublishedCheckpoint } from '../test/mock_structs.js';
import { makeCheckpoint, makePublishedCheckpoint } from '../test/mock_structs.js';
import { ArchiverDataStoreUpdater } from './data_store_updater.js';

/** Loads the sample ContractClassPublished event payload from protocol-contracts fixtures. */
@@ -110,12 +108,7 @@ describe('ArchiverDataStoreUpdater', () => {
// Make sure it has a different archive root (which it will by default from random)
expect(conflictingBlock.archive.root.equals(localBlock.archive.root)).toBe(false);

const checkpointWithConflict = new Checkpoint(
conflictingBlock.archive,
CheckpointHeader.random({ slotNumber: SlotNumber(100) }),
[conflictingBlock],
CheckpointNumber(1),
);
const checkpointWithConflict = makeCheckpoint([conflictingBlock]);
const publishedCheckpoint = makePublishedCheckpoint(checkpointWithConflict, 10);

// This should detect the conflict and prune the local block
@@ -135,8 +128,7 @@
block.body.txEffects[0].contractClassLogs = [contractClassLog];
block.body.txEffects[0].privateLogs = [PrivateLog.fromBuffer(getSampleContractInstancePublishedEventPayload())];

const checkpoint = new Checkpoint(block.archive, CheckpointHeader.random(), [block], CheckpointNumber(1));
const publishedCheckpoint = makePublishedCheckpoint(checkpoint, 10);
const publishedCheckpoint = makePublishedCheckpoint(makeCheckpoint([block]), 10);

await updater.addCheckpoints([publishedCheckpoint]);

@@ -166,8 +158,7 @@ describe('ArchiverDataStoreUpdater', () => {
await updater.addProposedBlocks([block]);

// Create checkpoint with the SAME block (same archive root)
const checkpoint = new Checkpoint(block.archive, CheckpointHeader.random(), [block], CheckpointNumber(1));
const publishedCheckpoint = makePublishedCheckpoint(checkpoint, 10);
const publishedCheckpoint = makePublishedCheckpoint(makeCheckpoint([block]), 10);

await updater.addCheckpoints([publishedCheckpoint]);

@@ -196,13 +187,7 @@ describe('ArchiverDataStoreUpdater', () => {
});
expect(checkpointBlock.archive.root.equals(localBlock.archive.root)).toBe(false);

const checkpoint = new Checkpoint(
checkpointBlock.archive,
CheckpointHeader.random({ slotNumber: SlotNumber(100) }),
[checkpointBlock],
CheckpointNumber(1),
);
await updater.addCheckpoints([makePublishedCheckpoint(checkpoint, 10)]);
await updater.addCheckpoints([makePublishedCheckpoint(makeCheckpoint([checkpointBlock]), 10)]);

// Verify checkpoint block is stored
const storedBlock = await store.getBlock(BlockNumber(1));
7 changes: 6 additions & 1 deletion yarn-project/archiver/src/modules/data_store_updater.ts
@@ -11,7 +11,7 @@ import {
ContractInstanceUpdatedEvent,
} from '@aztec/protocol-contracts/instance-registry';
import type { L2Block, ValidateCheckpointResult } from '@aztec/stdlib/block';
import type { PublishedCheckpoint } from '@aztec/stdlib/checkpoint';
import { type PublishedCheckpoint, validateCheckpoint } from '@aztec/stdlib/checkpoint';
import {
type ExecutablePrivateFunctionWithMembershipProof,
type UtilityFunctionWithMembershipProof,
@@ -48,6 +48,7 @@ export class ArchiverDataStoreUpdater {
constructor(
private store: KVArchiverDataStore,
private l2TipsCache?: L2TipsCache,
private opts: { rollupManaLimit?: number } = {},
) {}

/**
@@ -97,6 +98,10 @@
checkpoints: PublishedCheckpoint[],
pendingChainValidationStatus?: ValidateCheckpointResult,
): Promise<ReconcileCheckpointsResult> {
for (const checkpoint of checkpoints) {
validateCheckpoint(checkpoint.checkpoint, { rollupManaLimit: this.opts?.rollupManaLimit });
}

const result = await this.store.transactionAsync(async () => {
// Before adding checkpoints, check for conflicts with local blocks if any
const { prunedBlocks, lastAlreadyInsertedBlockNumber } = await this.pruneMismatchingLocalBlocks(checkpoints);
10 changes: 8 additions & 2 deletions yarn-project/archiver/src/modules/l1_synchronizer.ts
@@ -69,13 +69,19 @@ export class ArchiverL1Synchronizer implements Traceable {
private readonly epochCache: EpochCache,
private readonly dateProvider: DateProvider,
private readonly instrumentation: ArchiverInstrumentation,
private readonly l1Constants: L1RollupConstants & { l1StartBlockHash: Buffer32; genesisArchiveRoot: Fr },
private readonly l1Constants: L1RollupConstants & {
l1StartBlockHash: Buffer32;
genesisArchiveRoot: Fr;
rollupManaLimit?: number;
},
private readonly events: ArchiverEmitter,
tracer: Tracer,
l2TipsCache?: L2TipsCache,
private readonly log: Logger = createLogger('archiver:l1-sync'),
) {
this.updater = new ArchiverDataStoreUpdater(this.store, l2TipsCache);
this.updater = new ArchiverDataStoreUpdater(this.store, l2TipsCache, {
rollupManaLimit: l1Constants.rollupManaLimit,
});
this.tracer = tracer;
}

26 changes: 20 additions & 6 deletions yarn-project/archiver/src/test/mock_structs.ts
@@ -127,6 +127,25 @@ export function makeL1PublishedData(l1BlockNumber: number): L1PublishedData {
return new L1PublishedData(BigInt(l1BlockNumber), BigInt(l1BlockNumber * 1000), makeBlockHash(l1BlockNumber));
}

/** Creates a Checkpoint from a list of blocks with a header that matches the blocks' structure. */
export function makeCheckpoint(blocks: L2Block[], checkpointNumber = CheckpointNumber(1)): Checkpoint {
const firstBlock = blocks[0];
const { slotNumber, timestamp, coinbase, feeRecipient, gasFees } = firstBlock.header.globalVariables;
return new Checkpoint(
blocks.at(-1)!.archive,
CheckpointHeader.random({
lastArchiveRoot: firstBlock.header.lastArchive.root,
slotNumber,
timestamp,
coinbase,
feeRecipient,
gasFees,
}),
blocks,
checkpointNumber,
);
}

/** Wraps a Checkpoint with L1 published data and random attestations. */
export function makePublishedCheckpoint(
checkpoint: Checkpoint,
@@ -301,11 +320,6 @@ export async function makeCheckpointWithLogs(
return txEffect;
});

const checkpoint = new Checkpoint(
AppendOnlyTreeSnapshot.random(),
CheckpointHeader.random(),
[block],
CheckpointNumber.fromBlockNumber(BlockNumber(blockNumber)),
);
const checkpoint = makeCheckpoint([block], CheckpointNumber.fromBlockNumber(BlockNumber(blockNumber)));
return makePublishedCheckpoint(checkpoint, blockNumber);
}
19 changes: 10 additions & 9 deletions yarn-project/aztec-node/src/aztec-node/server.ts
@@ -271,10 +271,11 @@ export class AztecNodeService implements AztecNode, AztecNodeAdmin, Traceable {
config.l1Contracts = { ...config.l1Contracts, ...l1ContractsAddresses };

const rollupContract = new RollupContract(publicClient, config.l1Contracts.rollupAddress.toString());
const [l1GenesisTime, slotDuration, rollupVersionFromRollup] = await Promise.all([
const [l1GenesisTime, slotDuration, rollupVersionFromRollup, rollupManaLimit] = await Promise.all([
rollupContract.getL1GenesisTime(),
rollupContract.getSlotDuration(),
rollupContract.getVersion(),
rollupContract.getManaLimit().then(Number),
] as const);

config.rollupVersion ??= Number(rollupVersionFromRollup);
@@ -342,15 +343,12 @@ export class AztecNodeService implements AztecNode, AztecNodeAdmin, Traceable {
deps.p2pClientDeps,
);

// We should really not be modifying the config object
config.txPublicSetupAllowList = config.txPublicSetupAllowList ?? (await getDefaultAllowedSetupFunctions());

// We'll accumulate sentinel watchers here
const watchers: Watcher[] = [];

// Create FullNodeCheckpointsBuilder for block proposal handling and tx validation
const validatorCheckpointsBuilder = new FullNodeCheckpointsBuilder(
{ ...config, l1GenesisTime, slotDuration: Number(slotDuration) },
{ ...config, l1GenesisTime, slotDuration: Number(slotDuration), rollupManaLimit },
worldStateSynchronizer,
archiver,
dateProvider,
@@ -487,7 +485,7 @@

// Create and start the sequencer client
const checkpointsBuilder = new CheckpointsBuilder(
{ ...config, l1GenesisTime, slotDuration: Number(slotDuration) },
{ ...config, l1GenesisTime, slotDuration: Number(slotDuration), rollupManaLimit },
worldStateSynchronizer,
archiver,
dateProvider,
@@ -618,7 +616,7 @@ export class AztecNodeService implements AztecNode, AztecNodeAdmin, Traceable {
}

public async getAllowedPublicSetup(): Promise<AllowedElement[]> {
return this.config.txPublicSetupAllowList ?? (await getDefaultAllowedSetupFunctions());
return [...(await getDefaultAllowedSetupFunctions()), ...(this.config.txPublicSetupAllowListExtend ?? [])];
}

/**
@@ -1277,7 +1275,7 @@
const processor = publicProcessorFactory.create(merkleTreeFork, newGlobalVariables, config);

// REFACTOR: Consider merging ProcessReturnValues into ProcessedTx
const [processedTxs, failedTxs, _usedTxs, returns, _blobFields, debugLogs] = await processor.process([tx]);
const [processedTxs, failedTxs, _usedTxs, returns, debugLogs] = await processor.process([tx]);
// REFACTOR: Consider returning the error rather than throwing
if (failedTxs.length) {
this.log.warn(`Simulated tx ${txHash} fails: ${failedTxs[0].error}`, { txHash });
Expand Down Expand Up @@ -1317,7 +1315,10 @@ export class AztecNodeService implements AztecNode, AztecNodeAdmin, Traceable {
blockNumber,
l1ChainId: this.l1ChainId,
rollupVersion: this.version,
setupAllowList: this.config.txPublicSetupAllowList ?? (await getDefaultAllowedSetupFunctions()),
setupAllowList: [
...(await getDefaultAllowedSetupFunctions()),
...(this.config.txPublicSetupAllowListExtend ?? []),
],
gasFees: await this.getCurrentMinFees(),
skipFeeEnforcement,
txsPermitted: !this.config.disableTransactions,
@@ -1,8 +1,10 @@
import { DEFAULT_L2_GAS_LIMIT, MAX_PROCESSABLE_DA_GAS_PER_CHECKPOINT } from '@aztec/constants';
import { Fr } from '@aztec/foundation/curves/bn254';
import { EthAddress } from '@aztec/foundation/eth-address';
import { AvmTestContractArtifact } from '@aztec/noir-test-contracts.js/AvmTest';
import { AztecAddress } from '@aztec/stdlib/aztec-address';
import type { ContractInstanceWithAddress } from '@aztec/stdlib/contract';
import { Gas } from '@aztec/stdlib/gas';
import { L2ToL1Message, ScopedL2ToL1Message } from '@aztec/stdlib/messaging';
import { NativeWorldStateService } from '@aztec/world-state';

@@ -187,9 +189,14 @@ describe('AVM check-circuit – unhappy paths 3', () => {
it(
'a nested exceptional halt is recovered from in caller',
async () => {
// The contract requires >200k DA gas (it allocates da_gas_left - 200_000 to the nested call).
// Use a higher DA gas limit than the default since DEFAULT_DA_GAS_LIMIT is ~196k.
const gasLimits = new Gas(MAX_PROCESSABLE_DA_GAS_PER_CHECKPOINT, DEFAULT_L2_GAS_LIMIT);
await tester.simProveVerifyAppLogic(
{ address: avmTestContractInstance.address, fnName: 'external_call_to_divide_by_zero_recovers', args: [] },
/*expectRevert=*/ false,
/*txLabel=*/ 'unlabeledTx',
gasLimits,
);
},
TIMEOUT,