diff --git a/docs/docs/protocol-specs/state/wonky-tree.md b/docs/docs/protocol-specs/state/wonky-tree.md index 14d1a9776754..093e7334e5ee 100644 --- a/docs/docs/protocol-specs/state/wonky-tree.md +++ b/docs/docs/protocol-specs/state/wonky-tree.md @@ -7,7 +7,7 @@ For example, using a balanced merkle tree to rollup 5 transactions requires padd ```mermaid graph BT R_c[Root] - + M4_c[Merge] M5_c[Merge] M4_c --> R_c @@ -27,12 +27,12 @@ graph BT B1_c --> M0_c B2_c --> M1_c B3_c --> M1_c - + M2_c[Merge] M3_c[Merge*] M2_c --> M5_c M3_c --> M5_c - + B4_c[Base] B5_c[Base*] B6_c[Base*] @@ -62,7 +62,7 @@ Our wonky tree implementation instead gives the below structure for 5 transactio ```mermaid graph BT R_c[Root] - + M4_c[Merge] M4_c --> R_c @@ -80,8 +80,8 @@ graph BT B1_c --> M0_c B2_c --> M1_c B3_c --> M1_c - - + + B4_c[Base] B4_c --> R_c @@ -115,7 +115,7 @@ graph graph BT M0_c[Merge 0] M1_c[Merge 1] - + B0_c[Base 0] B1_c[Base 1] B2_c[Base 2] @@ -124,7 +124,7 @@ graph BT B1_c --> M0_c B2_c --> M1_c B3_c --> M1_c - + B4_c[Base 4] ``` @@ -135,7 +135,7 @@ graph BT M0_c[Merge 0] M1_c[Merge 1] M2_c[Merge 2] - + B0_c[Base 0] B1_c[Base 1] B2_c[Base 2] @@ -144,10 +144,10 @@ graph BT B1_c --> M0_c B2_c --> M1_c B3_c --> M1_c - + M0_c --> M2_c M1_c --> M2_c - + B4_c[Base 4] ``` @@ -156,11 +156,11 @@ Once paired, the base layer has length 4, the next merge layer has 2, and the fi ```mermaid graph BT R_c[Root] - + M0_c[Merge 0] M1_c[Merge 1] M2_c[Merge 2] - + B0_c[Base 0] B1_c[Base 1] B2_c[Base 2] @@ -169,17 +169,18 @@ graph BT B1_c --> M0_c B2_c --> M1_c B3_c --> M1_c - + M0_c --> M2_c M1_c --> M2_c - + B4_c[Base 4] M2_c --> R_c B4_c --> R_c ``` + Since we have processed all base circuits, this final pair will be input to a root circuit. -Filling from left to right means that we can easily reconstruct the tree only from the number of transactions `n`. The above method ensures that the final tree is a combination of *balanced* subtrees of descending size. 
The widths of these subtrees are given by the decomposition of `n` into powers of 2. For example, 5 transactions: +Filling from left to right means that we can easily reconstruct the tree only from the number of transactions `n`. The above method ensures that the final tree is a combination of _balanced_ subtrees of descending size. The widths of these subtrees are given by the decomposition of `n` into powers of 2. For example, 5 transactions: ``` Subtrees: [4, 1] -> @@ -189,6 +190,7 @@ Subtrees: [4, 1] -> ``` For 31 transactions: + ``` Subtrees: [16, 8, 4, 2, 1] -> Merge D: left_subtree_root = balanced_tree(txs[0..16]) @@ -207,6 +209,7 @@ Subtrees: [16, 8, 4, 2, 1] -> } root = left_subtree_root | right_subtree_root ``` + An unrolled recursive algorithm is not the easiest thing to read. This diagram represents the 31 transactions rolled up in our wonky structure, where each `Merge ` is a 'subroot' above: ```mermaid @@ -215,36 +218,37 @@ graph BT M3_c[Merge D Subtree of 16 txs] R_c[Root] - - + + B4_c[Merge C Subtree of 8 txs] B5_c[Merge 1] - + B4_c --> M2_c B5_c --> M2_c - + B6_c[Merge B Subtree of 4 txs] B7_c[Merge 0] - + B6_c --> B5_c B7_c --> B5_c - + B8_c[Merge A Subtree of 2 txs] B9_c[Base 30] - + B8_c --> B7_c B9_c --> B7_c - + M3_c --> R_c M2_c --> R_c ``` + The tree is reconstructed to check the `txs_effects_hash` (= the root of a wonky tree given by leaves of each tx's `tx_effects`) on L1. We also reconstruct it to provide a membership path against the stored `out_hash` (= the root of a wonky tree given by leaves of each tx's L2 to L1 message tree root) for consuming a L2 to L1 message. -Currently, this tree is built via the [orchestrator](../../../../yarn-project/prover-client/src/orchestrator/proving-state.ts#74) given the number of transactions to rollup (`this.totalNumTxs`). Each 'node' is assigned a level (0 at the root) and index in that level. 
The below function finds the parent level: +Currently, this tree is built via the orchestrator given the number of transactions to rollup. Each 'node' is assigned a level (0 at the root) and index in that level. The below function finds the parent level: ``` // Calculates the index and level of the parent rollup circuit @@ -272,14 +276,14 @@ Currently, this tree is built via the [orchestrator](../../../../yarn-project/pr return [mergeLevel - 1n, thisIndex >> 1n, thisIndex & 1n]; } ``` - For example, `Base 4` above starts with `level = 3` and `index = 4`. Since we have an odd number of transactions at this level, `thisLevelSize` is set to 4 with `shiftUp = true`. - The while loop triggers and shifts up our node to `level = 2` and `index = 2`. This level (containing `Merge 0` and `Merge 1`) is of even length, so the loop continues. The next iteration shifts up to `level = 1` and `index = 1` - we now have an odd level, so the loop stops. The actual position of `Base 4` is therefore at `level = 1` and `index = 1`. This function returns the parent level of the input node, so we return `level = 0`, `index = 0`, correctly indicating that the parent of `Base 4` is the root. +For example, `Base 4` above starts with `level = 3` and `index = 4`. Since we have an odd number of transactions at this level, `thisLevelSize` is set to 4 with `shiftUp = true`. +The while loop triggers and shifts up our node to `level = 2` and `index = 2`. This level (containing `Merge 0` and `Merge 1`) is of even length, so the loop continues. The next iteration shifts up to `level = 1` and `index = 1` - we now have an odd level, so the loop stops. The actual position of `Base 4` is therefore at `level = 1` and `index = 1`. This function returns the parent level of the input node, so we return `level = 0`, `index = 0`, correctly indicating that the parent of `Base 4` is the root. 
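The powers-of-two decomposition that drives this tree shape is easy to reproduce on its own. Below is a minimal sketch (illustrative only, not the orchestrator's actual code); `hash` is a stand-in for the pairwise merge performed by the real merge/root circuits:

```typescript
import { createHash } from 'crypto';

// Stand-in for the pairwise hash performed by a merge/root circuit.
const hash = (l: string, r: string): string =>
  createHash('sha256').update(l + r).digest('hex');

// Decompose n into descending powers of two, e.g. 5 -> [4, 1], 31 -> [16, 8, 4, 2, 1].
function subtreeWidths(n: number): number[] {
  const widths: number[] = [];
  for (let bit = 1 << 30; bit >= 1; bit >>= 1) {
    if (n & bit) widths.push(bit);
  }
  return widths;
}

// Root of a balanced tree over a power-of-two number of leaves.
function balancedRoot(leaves: string[]): string {
  if (leaves.length === 1) return leaves[0];
  const next: string[] = [];
  for (let i = 0; i < leaves.length; i += 2) {
    next.push(hash(leaves[i], leaves[i + 1]));
  }
  return balancedRoot(next);
}

// Wonky root: a balanced subtree over the largest power-of-two prefix of the
// leaves, merged with the (recursively wonky) tree over the remaining leaves.
function wonkyRoot(leaves: string[]): string {
  const [width] = subtreeWidths(leaves.length);
  const left = balancedRoot(leaves.slice(0, width));
  if (width === leaves.length) return left;
  return hash(left, wonkyRoot(leaves.slice(width)));
}
```

For 5 leaves this reproduces the `[4, 1]` structure above: a balanced subtree over the first four leaves, merged at the root with the lone fifth leaf.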
### Flexible wonky trees -We can also encode the structure of *any* binary merkle tree by tracking `number_of_branches` and `number_of_leaves` for each node in the tree. This encoding was originally designed for [logs](../logs/index.md) before they were included in the `txs_effects_hash`, so the below explanation references the leaves stored in relation to logs and transactions. +We can also encode the structure of _any_ binary merkle tree by tracking `number_of_branches` and `number_of_leaves` for each node in the tree. This encoding was originally designed for [logs](../logs/index.md) before they were included in the `txs_effects_hash`, so the below explanation references the leaves stored in relation to logs and transactions. The benefit of this method as opposed to the one above is allowing for any binary structure and therefore allowing for 'skipping' leaves with no information. However, the encoding grows as the tree grows, by at least 2 bytes per node. The above implementation only requires the number of leaves to be encoded, which will likely only require a single field to store. @@ -419,4 +423,4 @@ function hash_tx_logs_data(logs_data) { } return res; } -``` \ No newline at end of file +``` diff --git a/l1-contracts/src/core/Rollup.sol b/l1-contracts/src/core/Rollup.sol index bd6cbe0246af..73cdccaac44e 100644 --- a/l1-contracts/src/core/Rollup.sol +++ b/l1-contracts/src/core/Rollup.sol @@ -365,10 +365,8 @@ contract Rollup is Leonidas, IRollup, ITestRollup { // new_archive.next_available_leaf_index: the new archive next available index publicInputs[3] = bytes32(header.globalVariables.blockNumber + 1); - // TODO(#7346): Currently previous block hash is unchecked, but will be checked in batch rollup (block merge -> root). - // block-building-helpers.ts is injecting as 0 for now, replicating here. 
// previous_block_hash: the block hash just preceding this block (will eventually become the end_block_hash of the prev batch) - publicInputs[4] = bytes32(0); + publicInputs[4] = blocks[header.globalVariables.blockNumber - 1].blockHash; // end_block_hash: the current block hash (will eventually become the hash of the final block proven in a batch) publicInputs[5] = blocks[header.globalVariables.blockNumber].blockHash; diff --git a/l1-contracts/src/core/libraries/ConstantsGen.sol b/l1-contracts/src/core/libraries/ConstantsGen.sol index ef12b666e100..3d488ae07d9a 100644 --- a/l1-contracts/src/core/libraries/ConstantsGen.sol +++ b/l1-contracts/src/core/libraries/ConstantsGen.sol @@ -93,6 +93,7 @@ library Constants { uint256 internal constant BLOCK_ROOT_ROLLUP_INDEX = 22; uint256 internal constant BLOCK_MERGE_ROLLUP_INDEX = 23; uint256 internal constant ROOT_ROLLUP_INDEX = 24; + uint256 internal constant BLOCK_ROOT_ROLLUP_FINAL_INDEX = 25; uint256 internal constant FUNCTION_SELECTOR_NUM_BYTES = 4; uint256 internal constant INITIALIZATION_SLOT_SEPARATOR = 1000000000; uint256 internal constant INITIAL_L2_BLOCK_NUM = 1; diff --git a/noir-projects/noir-protocol-circuits/Nargo.template.toml b/noir-projects/noir-protocol-circuits/Nargo.template.toml index e575b7ea7425..7d647db4b0a1 100644 --- a/noir-projects/noir-protocol-circuits/Nargo.template.toml +++ b/noir-projects/noir-protocol-circuits/Nargo.template.toml @@ -34,5 +34,6 @@ members = [ "crates/rollup-base-simulated", "crates/rollup-block-merge", "crates/rollup-block-root", + "crates/rollup-block-root-final", "crates/rollup-root", ] diff --git a/noir-projects/noir-protocol-circuits/crates/rollup-block-root-final/Nargo.toml b/noir-projects/noir-protocol-circuits/crates/rollup-block-root-final/Nargo.toml new file mode 100644 index 000000000000..2d827d6e1677 --- /dev/null +++ b/noir-projects/noir-protocol-circuits/crates/rollup-block-root-final/Nargo.toml @@ -0,0 +1,9 @@ +[package] +name = "rollup_block_root_final" +type = 
"bin" +authors = [""] +compiler_version = ">=0.18.0" + +[dependencies] +rollup_lib = { path = "../rollup-lib" } +types = { path = "../types" } diff --git a/noir-projects/noir-protocol-circuits/crates/rollup-block-root-final/src/main.nr b/noir-projects/noir-protocol-circuits/crates/rollup-block-root-final/src/main.nr new file mode 100644 index 000000000000..b23b532e957b --- /dev/null +++ b/noir-projects/noir-protocol-circuits/crates/rollup-block-root-final/src/main.nr @@ -0,0 +1,7 @@ +use dep::rollup_lib::block_root::{BlockRootRollupInputs, BlockRootOrBlockMergePublicInputs}; + +// This is a non-recursive variant of the rollup-block-root. We use it so we can generate proofs that can be verified on L1, until we +// drop support for proving single blocks and move to epoch proving completely. +fn main(inputs: BlockRootRollupInputs) -> pub BlockRootOrBlockMergePublicInputs { + inputs.block_root_rollup_circuit() +} diff --git a/noir-projects/noir-protocol-circuits/crates/rollup-block-root/src/main.nr b/noir-projects/noir-protocol-circuits/crates/rollup-block-root/src/main.nr index d5e7a5e691db..f4ce060103fa 100644 --- a/noir-projects/noir-protocol-circuits/crates/rollup-block-root/src/main.nr +++ b/noir-projects/noir-protocol-circuits/crates/rollup-block-root/src/main.nr @@ -1,5 +1,6 @@ use dep::rollup_lib::block_root::{BlockRootRollupInputs, BlockRootOrBlockMergePublicInputs}; +#[recursive] fn main(inputs: BlockRootRollupInputs) -> pub BlockRootOrBlockMergePublicInputs { inputs.block_root_rollup_circuit() } diff --git a/noir-projects/noir-protocol-circuits/crates/types/src/constants.nr b/noir-projects/noir-protocol-circuits/crates/types/src/constants.nr index 47ccd7c840fa..60217db59719 100644 --- a/noir-projects/noir-protocol-circuits/crates/types/src/constants.nr +++ b/noir-projects/noir-protocol-circuits/crates/types/src/constants.nr @@ -119,6 +119,7 @@ global MERGE_ROLLUP_INDEX: u32 = 21; global BLOCK_ROOT_ROLLUP_INDEX: u32 = 22; global BLOCK_MERGE_ROLLUP_INDEX: u32 
= 23; global ROOT_ROLLUP_INDEX: u32 = 24; +global BLOCK_ROOT_ROLLUP_FINAL_INDEX: u32 = 25; // MISC CONSTANTS global FUNCTION_SELECTOR_NUM_BYTES: Field = 4; diff --git a/noir-projects/noir-protocol-circuits/crates/types/src/tests/fixtures/vk_tree.nr b/noir-projects/noir-protocol-circuits/crates/types/src/tests/fixtures/vk_tree.nr index cb95277f7f28..75e24d3c28dd 100644 --- a/noir-projects/noir-protocol-circuits/crates/types/src/tests/fixtures/vk_tree.nr +++ b/noir-projects/noir-protocol-circuits/crates/types/src/tests/fixtures/vk_tree.nr @@ -5,7 +5,7 @@ use crate::constants::{ EMPTY_NESTED_INDEX, PRIVATE_KERNEL_EMPTY_INDEX, PUBLIC_KERNEL_INNER_INDEX, PUBLIC_KERNEL_MERGE_INDEX, PUBLIC_KERNEL_TAIL_INDEX, BASE_PARITY_INDEX, ROOT_PARITY_INDEX, BASE_ROLLUP_INDEX, MERGE_ROLLUP_INDEX, BLOCK_ROOT_ROLLUP_INDEX, BLOCK_MERGE_ROLLUP_INDEX, - ROOT_ROLLUP_INDEX, PRIVATE_KERNEL_RESET_TINY_INDEX + ROOT_ROLLUP_INDEX, PRIVATE_KERNEL_RESET_TINY_INDEX, BLOCK_ROOT_ROLLUP_FINAL_INDEX }; use crate::merkle_tree::merkle_tree::MerkleTree; @@ -41,6 +41,7 @@ pub fn get_vk_merkle_tree() -> MerkleTree { leaves[BLOCK_ROOT_ROLLUP_INDEX] = 22; leaves[BLOCK_MERGE_ROLLUP_INDEX] = 23; leaves[ROOT_ROLLUP_INDEX] = 24; + leaves[BLOCK_ROOT_ROLLUP_FINAL_INDEX] = 25; MerkleTree::new(leaves) } diff --git a/yarn-project/bb-prover/src/honk.ts b/yarn-project/bb-prover/src/honk.ts index 8c13ed144759..93c72cae6e82 100644 --- a/yarn-project/bb-prover/src/honk.ts +++ b/yarn-project/bb-prover/src/honk.ts @@ -2,7 +2,10 @@ import { type ProtocolArtifact } from '@aztec/noir-protocol-circuits-types'; export type UltraHonkFlavor = 'ultra_honk' | 'ultra_keccak_honk'; -const UltraKeccakHonkCircuits = ['BlockRootRollupArtifact'] as const; +const UltraKeccakHonkCircuits = [ + 'BlockRootRollupFinalArtifact', + 'RootRollupArtifact', +] as const satisfies ProtocolArtifact[]; type UltraKeccakHonkCircuits = (typeof UltraKeccakHonkCircuits)[number]; type UltraHonkCircuits = Exclude; diff --git a/yarn-project/bb-prover/src/index.ts 
b/yarn-project/bb-prover/src/index.ts index 303e83125190..e8914146199c 100644 --- a/yarn-project/bb-prover/src/index.ts +++ b/yarn-project/bb-prover/src/index.ts @@ -3,3 +3,5 @@ export * from './test/index.js'; export * from './verifier/index.js'; export * from './config.js'; export * from './bb/execute.js'; + +export { type ClientProtocolCircuitVerifier } from '@aztec/circuit-types'; diff --git a/yarn-project/bb-prover/src/prover/bb_prover.ts b/yarn-project/bb-prover/src/prover/bb_prover.ts index 966b7edcc953..5aff5612630f 100644 --- a/yarn-project/bb-prover/src/prover/bb_prover.ts +++ b/yarn-project/bb-prover/src/prover/bb_prover.ts @@ -365,20 +365,43 @@ export class BBNativeRollupProver implements ServerCircuitProver { public async getBlockRootRollupProof( input: BlockRootRollupInputs, ): Promise> { - // TODO(#7346): When batch rollups are integrated, we probably want the below to be this.createRecursiveProof - // since we will no longer be verifying it directly on L1 - const { circuitOutput, proof } = await this.createProof( + const { circuitOutput, proof } = await this.createRecursiveProof( input, 'BlockRootRollupArtifact', + NESTED_RECURSIVE_PROOF_LENGTH, + convertBlockRootRollupInputsToWitnessMap, + convertBlockRootRollupOutputsFromWitnessMap, + ); + + const verificationKey = await this.getVerificationKeyDataForCircuit('BlockRootRollupArtifact'); + + await this.verifyProof('BlockRootRollupArtifact', proof.binaryProof); + + return makePublicInputsAndRecursiveProof(circuitOutput, proof, verificationKey); + } + + /** + * Simulates the block root rollup circuit from its inputs. + * Returns a non-recursive proof to verify on L1. + * @dev TODO(palla/prover): This is a temporary workaround to get the proof to L1 with the old block flow. + * @param input - Inputs to the circuit. + * @returns The public inputs as outputs of the simulation. 
+ */ + public async getBlockRootRollupFinalProof( + input: BlockRootRollupInputs, + ): Promise> { + const { circuitOutput, proof } = await this.createProof( + input, + 'BlockRootRollupFinalArtifact', convertBlockRootRollupInputsToWitnessMap, convertBlockRootRollupOutputsFromWitnessMap, ); const recursiveProof = makeRecursiveProofFromBinary(proof, NESTED_RECURSIVE_PROOF_LENGTH); - const verificationKey = await this.getVerificationKeyDataForCircuit('BlockRootRollupArtifact'); + const verificationKey = await this.getVerificationKeyDataForCircuit('BlockRootRollupFinalArtifact'); - await this.verifyProof('BlockRootRollupArtifact', proof); + await this.verifyProof('BlockRootRollupFinalArtifact', proof); return makePublicInputsAndRecursiveProof(circuitOutput, recursiveProof, verificationKey); } diff --git a/yarn-project/bb-prover/src/stats.ts b/yarn-project/bb-prover/src/stats.ts index 41058c9a9156..92bd1b38e65a 100644 --- a/yarn-project/bb-prover/src/stats.ts +++ b/yarn-project/bb-prover/src/stats.ts @@ -49,6 +49,8 @@ export function mapProtocolArtifactNameToCircuitName( return 'empty-nested'; case 'PrivateKernelEmptyArtifact': return 'private-kernel-empty'; + case 'BlockRootRollupFinalArtifact': + return 'block-root-rollup-final'; default: { const _foo: never = artifact; throw new Error(`Unknown circuit type: ${artifact}`); diff --git a/yarn-project/bb-prover/src/test/test_circuit_prover.ts b/yarn-project/bb-prover/src/test/test_circuit_prover.ts index 6531a7a39e29..6ccfb948d924 100644 --- a/yarn-project/bb-prover/src/test/test_circuit_prover.ts +++ b/yarn-project/bb-prover/src/test/test_circuit_prover.ts @@ -347,6 +347,12 @@ export class TestCircuitProver implements ServerCircuitProver { ); } + public getBlockRootRollupFinalProof( + input: BlockRootRollupInputs, + ): Promise> { + return this.getBlockRootRollupProof(input); + } + /** * Simulates the block merge rollup circuit from its inputs. * @param input - Inputs to the circuit. 
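The `default` branch of `mapProtocolArtifactNameToCircuitName` in `stats.ts` above relies on TypeScript's `never` type to force a compile error whenever a new artifact (such as `BlockRootRollupFinalArtifact`) is added to the union but not to the switch. A distilled sketch of that pattern, with an illustrative union in place of the generated `ProtocolArtifact` type:

```typescript
// Illustrative union; the real one is the generated ProtocolArtifact type.
type Artifact = 'BaseRollup' | 'MergeRollup' | 'BlockRootRollupFinal';

function toCircuitName(artifact: Artifact): string {
  switch (artifact) {
    case 'BaseRollup':
      return 'base-rollup';
    case 'MergeRollup':
      return 'merge-rollup';
    case 'BlockRootRollupFinal':
      return 'block-root-rollup-final';
    default: {
      // If a member is added to Artifact without a matching case above,
      // artifact is no longer narrowed to never here and this assignment
      // fails to type-check, surfacing the omission at compile time.
      const _exhaustive: never = artifact;
      throw new Error(`Unknown artifact: ${_exhaustive}`);
    }
  }
}
```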
diff --git a/yarn-project/circuit-types/src/interfaces/block-prover.ts b/yarn-project/circuit-types/src/interfaces/block-prover.ts index 4823fcbc5050..8ef43b177335 100644 --- a/yarn-project/circuit-types/src/interfaces/block-prover.ts +++ b/yarn-project/circuit-types/src/interfaces/block-prover.ts @@ -52,7 +52,7 @@ export interface BlockSimulator extends ProcessedTxHandler { startNewBlock(numTxs: number, globalVariables: GlobalVariables, l1ToL2Messages: Fr[]): Promise; /** Cancels the block currently being processed. Processes already in progress built may continue but further proofs should not be started. */ - cancelBlock(): void; + cancel(): void; /** Performs the final archive tree insertion for this block and returns the L2Block. */ finaliseBlock(): Promise; @@ -72,3 +72,7 @@ export interface BlockProver extends BlockSimulator { /** Performs the final archive tree insertion for this block and returns the L2Block. */ finaliseBlock(): Promise; } + +export interface EpochProver extends BlockProver { + startNewEpoch(epochNumber: number, totalNumBlocks: number): ProvingTicket; +} diff --git a/yarn-project/circuit-types/src/interfaces/proving-job.ts b/yarn-project/circuit-types/src/interfaces/proving-job.ts index 0482e91abcb9..c805b4cb80eb 100644 --- a/yarn-project/circuit-types/src/interfaces/proving-job.ts +++ b/yarn-project/circuit-types/src/interfaces/proving-job.ts @@ -76,6 +76,7 @@ export enum ProvingRequestType { BASE_ROLLUP, MERGE_ROLLUP, BLOCK_ROOT_ROLLUP, + BLOCK_ROOT_ROLLUP_FINAL, BLOCK_MERGE_ROLLUP, ROOT_ROLLUP, @@ -103,6 +104,8 @@ export function mapProvingRequestTypeToCircuitName(type: ProvingRequestType): Ci return 'merge-rollup'; case ProvingRequestType.BLOCK_ROOT_ROLLUP: return 'block-root-rollup'; + case ProvingRequestType.BLOCK_ROOT_ROLLUP_FINAL: + return 'block-root-rollup-final'; case ProvingRequestType.BLOCK_MERGE_ROLLUP: return 'block-merge-rollup'; case ProvingRequestType.ROOT_ROLLUP: @@ -161,6 +164,10 @@ export type ProvingRequest = type: 
ProvingRequestType.BLOCK_ROOT_ROLLUP; inputs: BlockRootRollupInputs; } + | { + type: ProvingRequestType.BLOCK_ROOT_ROLLUP_FINAL; + inputs: BlockRootRollupInputs; + } | { type: ProvingRequestType.BLOCK_MERGE_ROLLUP; inputs: BlockMergeRollupInputs; @@ -189,6 +196,7 @@ export type ProvingRequestPublicInputs = { [ProvingRequestType.BASE_ROLLUP]: PublicInputsAndRecursiveProof; [ProvingRequestType.MERGE_ROLLUP]: PublicInputsAndRecursiveProof; [ProvingRequestType.BLOCK_ROOT_ROLLUP]: PublicInputsAndRecursiveProof; + [ProvingRequestType.BLOCK_ROOT_ROLLUP_FINAL]: PublicInputsAndRecursiveProof; [ProvingRequestType.BLOCK_MERGE_ROLLUP]: PublicInputsAndRecursiveProof; [ProvingRequestType.ROOT_ROLLUP]: PublicInputsAndRecursiveProof; diff --git a/yarn-project/circuit-types/src/interfaces/server_circuit_prover.ts b/yarn-project/circuit-types/src/interfaces/server_circuit_prover.ts index b73d7cd34f79..741b05e25f5c 100644 --- a/yarn-project/circuit-types/src/interfaces/server_circuit_prover.ts +++ b/yarn-project/circuit-types/src/interfaces/server_circuit_prover.ts @@ -95,6 +95,16 @@ export interface ServerCircuitProver { epochNumber?: number, ): Promise>; + /** + * Creates a proof for the given input. + * @param input - Input to the circuit. + */ + getBlockRootRollupFinalProof( + input: BlockRootRollupInputs, + signal?: AbortSignal, + epochNumber?: number, + ): Promise>; + /** * Creates a proof for the given input. * @param input - Input to the circuit. 
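The `ProvingRequestPublicInputs` table above maps each `ProvingRequestType` to its output type, so handlers for each request kind can be checked statically. A reduced sketch of the pattern (enum members and payload shapes here are illustrative, not the real ones):

```typescript
enum RequestType {
  BLOCK_ROOT_ROLLUP,
  BLOCK_ROOT_ROLLUP_FINAL,
}

// Map each request type to its result type. Adding an enum member without
// extending this table (and the resolver object below) is a compile error.
type RequestOutputs = {
  [RequestType.BLOCK_ROOT_ROLLUP]: { recursiveProof: string };
  [RequestType.BLOCK_ROOT_ROLLUP_FINAL]: { binaryProof: string };
};

const resolvers: { [K in RequestType]: () => RequestOutputs[K] } = {
  [RequestType.BLOCK_ROOT_ROLLUP]: () => ({ recursiveProof: 'recursive' }),
  [RequestType.BLOCK_ROOT_ROLLUP_FINAL]: () => ({ binaryProof: 'keccak' }),
};

// Callers get the precise output type for the request kind they pass in.
function resolve<K extends RequestType>(type: K): RequestOutputs[K] {
  return resolvers[type]();
}
```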
diff --git a/yarn-project/circuit-types/src/stats/stats.ts b/yarn-project/circuit-types/src/stats/stats.ts index fd026b5d3689..30dcb9f460a9 100644 --- a/yarn-project/circuit-types/src/stats/stats.ts +++ b/yarn-project/circuit-types/src/stats/stats.ts @@ -77,6 +77,7 @@ export type CircuitName = | 'base-rollup' | 'merge-rollup' | 'block-root-rollup' + | 'block-root-rollup-final' | 'block-merge-rollup' | 'root-rollup' | 'private-kernel-init' diff --git a/yarn-project/circuits.js/src/constants.gen.ts b/yarn-project/circuits.js/src/constants.gen.ts index 0c3b3acac640..2737fa9c6c5a 100644 --- a/yarn-project/circuits.js/src/constants.gen.ts +++ b/yarn-project/circuits.js/src/constants.gen.ts @@ -79,6 +79,7 @@ export const MERGE_ROLLUP_INDEX = 21; export const BLOCK_ROOT_ROLLUP_INDEX = 22; export const BLOCK_MERGE_ROLLUP_INDEX = 23; export const ROOT_ROLLUP_INDEX = 24; +export const BLOCK_ROOT_ROLLUP_FINAL_INDEX = 25; export const FUNCTION_SELECTOR_NUM_BYTES = 4; export const INITIALIZATION_SLOT_SEPARATOR = 1000000000; export const INITIAL_L2_BLOCK_NUM = 1; diff --git a/yarn-project/circuits.js/src/structs/global_variables.ts b/yarn-project/circuits.js/src/structs/global_variables.ts index 50e88a61592c..fbf567552304 100644 --- a/yarn-project/circuits.js/src/structs/global_variables.ts +++ b/yarn-project/circuits.js/src/structs/global_variables.ts @@ -4,6 +4,8 @@ import { Fr } from '@aztec/foundation/fields'; import { BufferReader, FieldReader, serializeToBuffer, serializeToFields } from '@aztec/foundation/serialize'; import { type FieldsOf } from '@aztec/foundation/types'; +import { inspect } from 'util'; + import { GLOBAL_VARIABLES_LENGTH } from '../constants.gen.js'; import { GasFees } from './gas_fees.js'; @@ -150,4 +152,10 @@ export class GlobalVariables { this.gasFees.isEmpty() ); } + + [inspect.custom]() { + return `GlobalVariables { chainId: ${this.chainId.toString()}, version: ${this.version.toString()}, blockNumber: ${this.blockNumber.toString()}, slotNumber: 
${this.slotNumber.toString()}, timestamp: ${this.timestamp.toString()}, coinbase: ${this.coinbase.toString()}, feeRecipient: ${this.feeRecipient.toString()}, gasFees: ${inspect( + this.gasFees, + )} }`; + } } diff --git a/yarn-project/circuits.js/src/structs/header.ts b/yarn-project/circuits.js/src/structs/header.ts index dbdb09881982..0b06dc7cfd38 100644 --- a/yarn-project/circuits.js/src/structs/header.ts +++ b/yarn-project/circuits.js/src/structs/header.ts @@ -3,6 +3,8 @@ import { Fr } from '@aztec/foundation/fields'; import { BufferReader, FieldReader, serializeToBuffer, serializeToFields } from '@aztec/foundation/serialize'; import { type FieldsOf } from '@aztec/foundation/types'; +import { inspect } from 'util'; + import { GeneratorIndex, HEADER_LENGTH } from '../constants.gen.js'; import { ContentCommitment } from './content_commitment.js'; import { GlobalVariables } from './global_variables.js'; @@ -125,4 +127,20 @@ export class Header { hash(): Fr { return poseidon2HashWithSeparator(this.toFields(), GeneratorIndex.BLOCK_HASH); } + + [inspect.custom]() { + return `Header { + lastArchive: ${inspect(this.lastArchive)}, + contentCommitment.numTx: ${this.contentCommitment.numTxs.toNumber()}, + contentCommitment.txsEffectsHash: ${this.contentCommitment.txsEffectsHash.toString('hex')}, + contentCommitment.inHash: ${this.contentCommitment.inHash.toString('hex')}, + contentCommitment.outHash: ${this.contentCommitment.outHash.toString('hex')}, + state.l1ToL2MessageTree: ${inspect(this.state.l1ToL2MessageTree)}, + state.noteHashTree: ${inspect(this.state.partial.noteHashTree)}, + state.nullifierTree: ${inspect(this.state.partial.nullifierTree)}, + state.publicDataTree: ${inspect(this.state.partial.publicDataTree)}, + globalVariables: ${inspect(this.globalVariables)}, + totalFees: ${this.totalFees}, +}`; + } } diff --git a/yarn-project/circuits.js/src/structs/rollup/append_only_tree_snapshot.ts b/yarn-project/circuits.js/src/structs/rollup/append_only_tree_snapshot.ts 
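The `[inspect.custom]` methods added to `GlobalVariables` and `Header` above use Node's hook for customizing `util.inspect` (and hence `console.log`) output. A minimal standalone sketch of the convention (the class and fields here are illustrative):

```typescript
import { inspect } from 'util';

class TreeSnapshot {
  constructor(public root: string, public nextAvailableLeafIndex: number) {}

  // Node calls this instead of the default object formatter when the
  // instance is passed to util.inspect or console.log.
  [inspect.custom](): string {
    return `TreeSnapshot { root: ${this.root}, nextAvailableLeafIndex: ${this.nextAvailableLeafIndex} }`;
  }
}
```

With this in place, `inspect(new TreeSnapshot('0xabc', 5))` yields the compact one-line form rather than the default multi-line object dump.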
index c831cbef31f4..e1256fb3cb0c 100644 --- a/yarn-project/circuits.js/src/structs/rollup/append_only_tree_snapshot.ts +++ b/yarn-project/circuits.js/src/structs/rollup/append_only_tree_snapshot.ts @@ -1,6 +1,8 @@ import { Fr } from '@aztec/foundation/fields'; import { BufferReader, FieldReader, serializeToBuffer } from '@aztec/foundation/serialize'; +import { inspect } from 'util'; + import { STRING_ENCODING, type UInt32 } from '../shared.js'; /** @@ -64,4 +66,10 @@ export class AppendOnlyTreeSnapshot { isZero(): boolean { return this.root.isZero() && this.nextAvailableLeafIndex === 0; } + + [inspect.custom]() { + return `AppendOnlyTreeSnapshot { root: ${this.root.toString()}, nextAvailableLeafIndex: ${ + this.nextAvailableLeafIndex + } }`; + } } diff --git a/yarn-project/circuits.js/src/structs/rollup/block_root_or_block_merge_public_inputs.ts b/yarn-project/circuits.js/src/structs/rollup/block_root_or_block_merge_public_inputs.ts index 193acbff4852..48293c44896c 100644 --- a/yarn-project/circuits.js/src/structs/rollup/block_root_or_block_merge_public_inputs.ts +++ b/yarn-project/circuits.js/src/structs/rollup/block_root_or_block_merge_public_inputs.ts @@ -1,9 +1,9 @@ +import { EthAddress } from '@aztec/foundation/eth-address'; import { Fr } from '@aztec/foundation/fields'; import { BufferReader, type Tuple, serializeToBuffer, serializeToFields } from '@aztec/foundation/serialize'; import { type FieldsOf } from '@aztec/foundation/types'; import { GlobalVariables } from '../global_variables.js'; -import { EthAddress } from '../index.js'; import { AppendOnlyTreeSnapshot } from './append_only_tree_snapshot.js'; /** @@ -131,4 +131,15 @@ export class FeeRecipient { toFields() { return serializeToFields(...FeeRecipient.getFields(this)); } + + isEmpty() { + return this.value.isZero() && this.recipient.isZero(); + } + + toFriendlyJSON() { + if (this.isEmpty()) { + return {}; + } + return { recipient: this.recipient.toString(), value: this.value.toString() }; + } } diff 
--git a/yarn-project/circuits.js/src/structs/rollup/block_root_rollup.ts b/yarn-project/circuits.js/src/structs/rollup/block_root_rollup.ts index 18de9e20c8d1..0a4eece23745 100644 --- a/yarn-project/circuits.js/src/structs/rollup/block_root_rollup.ts +++ b/yarn-project/circuits.js/src/structs/rollup/block_root_rollup.ts @@ -47,7 +47,6 @@ export class BlockRootRollupInputs { public newArchiveSiblingPath: Tuple, /** * The hash of the block preceding this one. - * TODO(#7346): Integrate batch rollup circuits and inject below */ public previousBlockHash: Fr, /** diff --git a/yarn-project/circuits.js/src/structs/state_reference.ts b/yarn-project/circuits.js/src/structs/state_reference.ts index 5aa48b38c1c9..3253371929cc 100644 --- a/yarn-project/circuits.js/src/structs/state_reference.ts +++ b/yarn-project/circuits.js/src/structs/state_reference.ts @@ -1,6 +1,8 @@ import { type Fr } from '@aztec/foundation/fields'; import { BufferReader, FieldReader, serializeToBuffer } from '@aztec/foundation/serialize'; +import { inspect } from 'util'; + import { STATE_REFERENCE_LENGTH } from '../constants.gen.js'; import { PartialStateReference } from './partial_state_reference.js'; import { AppendOnlyTreeSnapshot } from './rollup/append_only_tree_snapshot.js'; @@ -56,4 +58,13 @@ export class StateReference { isEmpty(): boolean { return this.l1ToL2MessageTree.isZero() && this.partial.isEmpty(); } + + [inspect.custom]() { + return `StateReference { + l1ToL2MessageTree: ${inspect(this.l1ToL2MessageTree)}, + noteHashTree: ${inspect(this.partial.noteHashTree)}, + nullifierTree: ${inspect(this.partial.nullifierTree)}, + publicDataTree: ${inspect(this.partial.publicDataTree)}, +}`; + } } diff --git a/yarn-project/cli/src/cmds/l1/deploy_l1_verifier.ts b/yarn-project/cli/src/cmds/l1/deploy_l1_verifier.ts index cc346d98cdad..656986f5b580 100644 --- a/yarn-project/cli/src/cmds/l1/deploy_l1_verifier.ts +++ b/yarn-project/cli/src/cmds/l1/deploy_l1_verifier.ts @@ -27,7 +27,7 @@ export async 
function deployUltraHonkVerifier( const circuitVerifier = await BBCircuitVerifier.new({ bbBinaryPath, bbWorkingDirectory }); const contractSrc = await circuitVerifier.generateSolidityContract( - 'BlockRootRollupArtifact', + 'BlockRootRollupFinalArtifact', 'UltraHonkVerifier.sol', ); log('Generated UltraHonkVerifier contract'); diff --git a/yarn-project/end-to-end/src/composed/integration_l1_publisher.test.ts b/yarn-project/end-to-end/src/composed/integration_l1_publisher.test.ts index b8911b0ccd94..06709cf70285 100644 --- a/yarn-project/end-to-end/src/composed/integration_l1_publisher.test.ts +++ b/yarn-project/end-to-end/src/composed/integration_l1_publisher.test.ts @@ -11,7 +11,7 @@ import { } from '@aztec/aztec.js'; // eslint-disable-next-line no-restricted-imports import { - type BlockProver, + type BlockSimulator, PROVING_STATUS, type ProcessedTx, makeEmptyProcessedTx as makeEmptyProcessedTxFromHistoricalTreeRoots, @@ -40,7 +40,6 @@ import { openTmpStore } from '@aztec/kv-store/utils'; import { OutboxAbi, RollupAbi } from '@aztec/l1-artifacts'; import { SHA256Trunc, StandardTree } from '@aztec/merkle-tree'; import { getVKTreeRoot } from '@aztec/noir-protocol-circuits-types'; -import { TxProver } from '@aztec/prover-client'; import { L1Publisher } from '@aztec/sequencer-client'; import { NoopTelemetryClient } from '@aztec/telemetry-client/noop'; import { MerkleTrees, ServerWorldStateSynchronizer, type WorldStateConfig } from '@aztec/world-state'; @@ -63,6 +62,7 @@ import { } from 'viem'; import { type PrivateKeyAccount, privateKeyToAccount } from 'viem/accounts'; +import { LightweightBlockBuilder } from '../../../sequencer-client/src/block_builder/light.js'; import { sendL1ToL2Message } from '../fixtures/l1_to_l2_messaging.js'; import { setupL1Contracts } from '../fixtures/utils.js'; @@ -90,9 +90,8 @@ describe('L1Publisher integration', () => { let publisher: L1Publisher; - let builder: TxProver; + let builder: BlockSimulator; let builderDb: MerkleTrees; - let 
prover: BlockProver; // The header of the last block let prevHeader: Header; @@ -157,8 +156,7 @@ describe('L1Publisher integration', () => { }; worldStateSynchronizer = new ServerWorldStateSynchronizer(tmpStore, builderDb, blockSource, worldStateConfig); await worldStateSynchronizer.start(); - builder = await TxProver.new(config, new NoopTelemetryClient()); - prover = builder.createBlockProver(builderDb.asLatest()); + builder = new LightweightBlockBuilder(builderDb.asLatest(), new NoopTelemetryClient()); publisher = new L1Publisher( { @@ -314,9 +312,9 @@ describe('L1Publisher integration', () => { }; const buildBlock = async (globalVariables: GlobalVariables, txs: ProcessedTx[], l1ToL2Messages: Fr[]) => { - const blockTicket = await prover.startNewBlock(txs.length, globalVariables, l1ToL2Messages); + const blockTicket = await builder.startNewBlock(txs.length, globalVariables, l1ToL2Messages); for (const tx of txs) { - await prover.addNewTx(tx); + await builder.addNewTx(tx); } return blockTicket; }; @@ -368,7 +366,7 @@ describe('L1Publisher integration', () => { const ticket = await buildBlock(globalVariables, txs, currentL1ToL2Messages); const result = await ticket.provingPromise; expect(result.status).toBe(PROVING_STATUS.SUCCESS); - const blockResult = await prover.finaliseBlock(); + const blockResult = await builder.finaliseBlock(); const block = blockResult.block; prevHeader = block.header; blockSource.getL1ToL2Messages.mockResolvedValueOnce(currentL1ToL2Messages); @@ -474,10 +472,10 @@ describe('L1Publisher integration', () => { GasFees.empty(), ); const blockTicket = await buildBlock(globalVariables, txs, l1ToL2Messages); - await prover.setBlockCompleted(); + await builder.setBlockCompleted(); const result = await blockTicket.provingPromise; expect(result.status).toBe(PROVING_STATUS.SUCCESS); - const blockResult = await prover.finaliseBlock(); + const blockResult = await builder.finaliseBlock(); const block = blockResult.block; prevHeader = block.header; 
blockSource.getL1ToL2Messages.mockResolvedValueOnce(l1ToL2Messages); diff --git a/yarn-project/end-to-end/src/composed/integration_proof_verification.test.ts b/yarn-project/end-to-end/src/composed/integration_proof_verification.test.ts index 594272d005bf..402cfe3fe819 100644 --- a/yarn-project/end-to-end/src/composed/integration_proof_verification.test.ts +++ b/yarn-project/end-to-end/src/composed/integration_proof_verification.test.ts @@ -74,7 +74,10 @@ describe('proof_verification', () => { acvmTeardown = acvm!.cleanup; logger.info('bb, acvm done'); - const content = await circuitVerifier.generateSolidityContract('BlockRootRollupArtifact', 'UltraHonkVerifier.sol'); + const content = await circuitVerifier.generateSolidityContract( + 'BlockRootRollupFinalArtifact', + 'UltraHonkVerifier.sol', + ); logger.info('generated contract'); const input = { @@ -137,7 +140,9 @@ describe('proof_verification', () => { describe('bb', () => { it('verifies proof', async () => { - await expect(circuitVerifier.verifyProofForCircuit('BlockRootRollupArtifact', proof)).resolves.toBeUndefined(); + await expect( + circuitVerifier.verifyProofForCircuit('BlockRootRollupFinalArtifact', proof), + ).resolves.toBeUndefined(); }); }); diff --git a/yarn-project/end-to-end/src/e2e_prover/e2e_prover_test.ts b/yarn-project/end-to-end/src/e2e_prover/e2e_prover_test.ts index fe3b7dac7967..89a833ff216c 100644 --- a/yarn-project/end-to-end/src/e2e_prover/e2e_prover_test.ts +++ b/yarn-project/end-to-end/src/e2e_prover/e2e_prover_test.ts @@ -16,7 +16,7 @@ import { createDebugLogger, deployL1Contract, } from '@aztec/aztec.js'; -import { BBCircuitVerifier } from '@aztec/bb-prover'; +import { BBCircuitVerifier, type ClientProtocolCircuitVerifier, TestCircuitVerifier } from '@aztec/bb-prover'; import { RollupAbi } from '@aztec/l1-artifacts'; import { TokenContract } from '@aztec/noir-contracts.js'; import { type ProverNode, type ProverNodeConfig, createProverNode } from '@aztec/prover-node'; @@ -73,14 +73,14 
@@ export class FullProverTest { private provenComponents: ProvenSetup[] = []; private bbConfigCleanup?: () => Promise<void>; private acvmConfigCleanup?: () => Promise<void>; - circuitProofVerifier?: BBCircuitVerifier; + circuitProofVerifier?: ClientProtocolCircuitVerifier; provenAssets: TokenContract[] = []; private context!: SubsystemsContext; private proverNode!: ProverNode; private simulatedProverNode!: ProverNode; private l1Contracts!: DeployL1Contracts; - constructor(testName: string, private minNumberOfTxsPerBlock: number) { + constructor(testName: string, private minNumberOfTxsPerBlock: number, private realProofs = true) { this.logger = createDebugLogger(`aztec:full_prover_test:${testName}`); this.snapshotManager = createSnapshotManager(`full_prover_integration/${testName}`, dataPath); } @@ -149,25 +149,31 @@ export class FullProverTest { // Configure a full prover PXE - const [acvmConfig, bbConfig] = await Promise.all([getACVMConfig(this.logger), getBBConfig(this.logger)]); - if (!acvmConfig || !bbConfig) { - throw new Error('Missing ACVM or BB config'); - } + let acvmConfig: Awaited<ReturnType<typeof getACVMConfig>> | undefined; + let bbConfig: Awaited<ReturnType<typeof getBBConfig>> | undefined; + if (this.realProofs) { + [acvmConfig, bbConfig] = await Promise.all([getACVMConfig(this.logger), getBBConfig(this.logger)]); + if (!acvmConfig || !bbConfig) { + throw new Error('Missing ACVM or BB config'); + } - this.acvmConfigCleanup = acvmConfig.cleanup; - this.bbConfigCleanup = bbConfig.cleanup; + this.acvmConfigCleanup = acvmConfig.cleanup; + this.bbConfigCleanup = bbConfig.cleanup; - if (!bbConfig?.bbWorkingDirectory || !bbConfig?.bbBinaryPath) { - throw new Error(`Test must be run with BB native configuration`); - } + if (!bbConfig?.bbWorkingDirectory || !bbConfig?.bbBinaryPath) { + throw new Error(`Test must be run with BB native configuration`); + } - this.circuitProofVerifier = await BBCircuitVerifier.new(bbConfig); + this.circuitProofVerifier = await BBCircuitVerifier.new(bbConfig); - this.logger.debug(`Configuring the node 
for real proofs...`); - await this.aztecNode.setConfig({ - realProofs: true, - minTxsPerBlock: this.minNumberOfTxsPerBlock, - }); + this.logger.debug(`Configuring the node for real proofs...`); + await this.aztecNode.setConfig({ + realProofs: true, + minTxsPerBlock: this.minNumberOfTxsPerBlock, + }); + } else { + this.circuitProofVerifier = new TestCircuitVerifier(); + } this.logger.debug(`Main setup completed, initializing full prover PXE, Node, and Prover Node...`); @@ -175,7 +181,7 @@ export class FullProverTest { const result = await setupPXEService( this.aztecNode, { - proverEnabled: true, + proverEnabled: this.realProofs, bbBinaryPath: bbConfig?.bbBinaryPath, bbWorkingDirectory: bbConfig?.bbWorkingDirectory, }, @@ -239,7 +245,7 @@ export class FullProverTest { txProviderNodeUrl: undefined, dataDirectory: undefined, proverId: new Fr(81), - realProofs: true, + realProofs: this.realProofs, proverAgentConcurrency: 2, publisherPrivateKey: `0x${proverNodePrivateKey!.toString('hex')}`, proverNodeMaxPendingJobs: 100, @@ -341,14 +347,18 @@ export class FullProverTest { } async deployVerifier() { + if (!this.realProofs) { + return; + } + if (!this.circuitProofVerifier) { throw new Error('No verifier'); } const { walletClient, publicClient, l1ContractAddresses } = this.context.deployL1ContractsValues; - const contract = await this.circuitProofVerifier.generateSolidityContract( - 'BlockRootRollupArtifact', + const contract = await (this.circuitProofVerifier as BBCircuitVerifier).generateSolidityContract( + 'BlockRootRollupFinalArtifact', 'UltraHonkVerifier.sol', ); diff --git a/yarn-project/end-to-end/src/e2e_prover/full.test.ts b/yarn-project/end-to-end/src/e2e_prover/full.test.ts index 3dfd8d727a0d..f6d52aabb76f 100644 --- a/yarn-project/end-to-end/src/e2e_prover/full.test.ts +++ b/yarn-project/end-to-end/src/e2e_prover/full.test.ts @@ -8,7 +8,8 @@ const TIMEOUT = 1_800_000; process.env.AVM_PROVING_STRICT = '1'; describe('full_prover', () => { - const t = new 
FullProverTest('full_prover', 2); + const realProofs = !['true', '1'].includes(process.env.FAKE_PROOFS ?? ''); + const t = new FullProverTest('full_prover', 2, realProofs); let { provenAssets, accounts, tokenSim, logger } = t; beforeAll(async () => { @@ -83,6 +84,11 @@ describe('full_prover', () => { ); it('rejects txs with invalid proofs', async () => { + if (!realProofs) { + t.logger.warn(`Skipping test with fake proofs`); + return; + } + const privateInteraction = t.fakeProofsAsset.methods.transfer(accounts[1].address, 1); const publicInteraction = t.fakeProofsAsset.methods.transfer_public(accounts[0].address, accounts[1].address, 1, 0); diff --git a/yarn-project/noir-protocol-circuits-types/src/artifacts.ts b/yarn-project/noir-protocol-circuits-types/src/artifacts.ts index ac31030dff44..bd47118cf1c6 100644 --- a/yarn-project/noir-protocol-circuits-types/src/artifacts.ts +++ b/yarn-project/noir-protocol-circuits-types/src/artifacts.ts @@ -36,6 +36,7 @@ import BaseRollupJson from '../artifacts/rollup_base.json' assert { type: 'json' import BaseRollupSimulatedJson from '../artifacts/rollup_base_simulated.json' assert { type: 'json' }; import BlockMergeRollupJson from '../artifacts/rollup_block_merge.json' assert { type: 'json' }; import BlockRootRollupJson from '../artifacts/rollup_block_root.json' assert { type: 'json' }; +import BlockRootRollupFinalJson from '../artifacts/rollup_block_root_final.json' assert { type: 'json' }; import MergeRollupJson from '../artifacts/rollup_merge.json' assert { type: 'json' }; import RootRollupJson from '../artifacts/rollup_root.json' assert { type: 'json' }; @@ -67,6 +68,7 @@ export type ServerProtocolArtifact = | 'BaseRollupArtifact' | 'MergeRollupArtifact' | 'BlockRootRollupArtifact' + | 'BlockRootRollupFinalArtifact' // TODO(palla/prover): Delete this artifact | 'BlockMergeRollupArtifact' | 'RootRollupArtifact'; @@ -92,6 +94,7 @@ export const ServerCircuitArtifacts: Record<ServerProtocolArtifact, NoirCompiledCircuit> = { @@ -107,6 +110,7 @@ export const 
SimulatedServerCircuitArtifacts: Record<ServerProtocolArtifact, NoirCompiledCircuit> = { diff --git a/yarn-project/noir-protocol-circuits-types/src/vks.ts b/yarn-project/noir-protocol-circuits-types/src/vks.ts index d7d1223082eb..46f00bf4f2d1 100644 --- a/yarn-project/noir-protocol-circuits-types/src/vks.ts +++ b/yarn-project/noir-protocol-circuits-types/src/vks.ts @@ -2,6 +2,7 @@ import { BASE_PARITY_INDEX, BASE_ROLLUP_INDEX, BLOCK_MERGE_ROLLUP_INDEX, + BLOCK_ROOT_ROLLUP_FINAL_INDEX, BLOCK_ROOT_ROLLUP_INDEX, EMPTY_NESTED_INDEX, Fr, @@ -52,6 +53,7 @@ import PublicKernelTailVkJson from '../artifacts/keys/public_kernel_tail.vk.data import BaseRollupVkJson from '../artifacts/keys/rollup_base.vk.data.json' assert { type: 'json' }; import BlockMergeRollupVkJson from '../artifacts/keys/rollup_block_merge.vk.data.json' assert { type: 'json' }; import BlockRootRollupVkJson from '../artifacts/keys/rollup_block_root.vk.data.json' assert { type: 'json' }; +import BlockRootRollupFinalVkJson from '../artifacts/keys/rollup_block_root_final.vk.data.json' assert { type: 'json' }; import MergeRollupVkJson from '../artifacts/keys/rollup_merge.vk.data.json' assert { type: 'json' }; import RootRollupVkJson from '../artifacts/keys/rollup_root.vk.data.json' assert { type: 'json' }; import { type ClientProtocolArtifact, type ProtocolArtifact, type ServerProtocolArtifact } from './artifacts.js'; @@ -87,6 +89,7 @@ const ServerCircuitVks: Record<ServerProtocolArtifact, VerificationKeyData> = { BaseRollupArtifact: keyJsonToVKData(BaseRollupVkJson), MergeRollupArtifact: keyJsonToVKData(MergeRollupVkJson), BlockRootRollupArtifact: keyJsonToVKData(BlockRootRollupVkJson), + BlockRootRollupFinalArtifact: keyJsonToVKData(BlockRootRollupFinalVkJson), BlockMergeRollupArtifact: keyJsonToVKData(BlockMergeRollupVkJson), RootRollupArtifact: keyJsonToVKData(RootRollupVkJson), }; @@ -132,6 +135,7 @@ export const ProtocolCircuitVkIndexes: Record<ProtocolArtifact, number> = { BlockRootRollupArtifact: BLOCK_ROOT_ROLLUP_INDEX, BlockMergeRollupArtifact: BLOCK_MERGE_ROLLUP_INDEX, RootRollupArtifact: ROOT_ROLLUP_INDEX, + 
BlockRootRollupFinalArtifact: BLOCK_ROOT_ROLLUP_FINAL_INDEX, }; function buildVKTree() { diff --git a/yarn-project/prover-client/src/mocks/fixtures.ts b/yarn-project/prover-client/src/mocks/fixtures.ts index d0421e00455e..effc35c28142 100644 --- a/yarn-project/prover-client/src/mocks/fixtures.ts +++ b/yarn-project/prover-client/src/mocks/fixtures.ts @@ -135,9 +135,9 @@ export const makeGlobals = (blockNumber: number) => { return new GlobalVariables( Fr.ZERO, Fr.ZERO, - new Fr(blockNumber), + new Fr(blockNumber) /** block number */, new Fr(blockNumber) /** slot number */, - Fr.ZERO, + new Fr(blockNumber) /** timestamp */, EthAddress.ZERO, AztecAddress.ZERO, GasFees.empty(), diff --git a/yarn-project/prover-client/src/orchestrator/block-building-helpers.ts b/yarn-project/prover-client/src/orchestrator/block-building-helpers.ts index 9862ce40b22c..dd798f38b359 100644 --- a/yarn-project/prover-client/src/orchestrator/block-building-helpers.ts +++ b/yarn-project/prover-client/src/orchestrator/block-building-helpers.ts @@ -1,4 +1,4 @@ -import { MerkleTreeId, type ProcessedTx, getTreeHeight } from '@aztec/circuit-types'; +import { type Body, MerkleTreeId, type ProcessedTx, TxEffect, getTreeHeight } from '@aztec/circuit-types'; import { ARCHIVE_HEIGHT, AppendOnlyTreeSnapshot, @@ -6,25 +6,24 @@ import { BaseRollupInputs, BlockMergeRollupInputs, type BlockRootOrBlockMergePublicInputs, - BlockRootRollupInputs, ConstantRollupData, ContentCommitment, Fr, type GlobalVariables, Header, KernelData, - type L1_TO_L2_MSG_SUBTREE_SIBLING_PATH_LENGTH, MAX_NULLIFIERS_PER_TX, MAX_TOTAL_PUBLIC_DATA_UPDATE_REQUESTS_PER_TX, MembershipWitness, MergeRollupInputs, + MerkleTreeCalculator, type NESTED_RECURSIVE_PROOF_LENGTH, NOTE_HASH_SUBTREE_HEIGHT, NOTE_HASH_SUBTREE_SIBLING_PATH_LENGTH, NULLIFIER_SUBTREE_HEIGHT, NULLIFIER_SUBTREE_SIBLING_PATH_LENGTH, NULLIFIER_TREE_HEIGHT, - type NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP, + NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP, NullifierLeafPreimage, 
PUBLIC_DATA_SUBTREE_HEIGHT, PUBLIC_DATA_SUBTREE_SIBLING_PATH_LENGTH, @@ -39,7 +38,6 @@ import { PublicDataUpdateRequest, type RECURSIVE_PROOF_LENGTH, type RecursiveProof, - type RootParityInput, RootRollupInputs, StateDiffHints, StateReference, @@ -52,10 +50,13 @@ import { padArrayEnd } from '@aztec/foundation/collection'; import { sha256Trunc } from '@aztec/foundation/crypto'; import { type DebugLogger } from '@aztec/foundation/log'; import { type Tuple, assertLength, toFriendlyJSON } from '@aztec/foundation/serialize'; +import { computeUnbalancedMerkleRoot } from '@aztec/foundation/trees'; import { getVKIndex, getVKSiblingPath, getVKTreeRoot } from '@aztec/noir-protocol-circuits-types'; import { HintsBuilder, computeFeePayerBalanceLeafSlot } from '@aztec/simulator'; import { type MerkleTreeOperations } from '@aztec/world-state'; +import { inspect } from 'util'; + /** * Type representing the names of the trees for the base rollup. */ @@ -193,7 +194,6 @@ export function createMergeRollupInputs( return mergeInputs; } -// TODO(#7346): Integrate batch rollup circuits and test below export function createBlockMergeRollupInputs( left: [ BlockRootOrBlockMergePublicInputs, @@ -217,7 +217,7 @@ export function buildHeaderFromCircuitOutputs( previousMergeData: [BaseOrMergeRollupPublicInputs, BaseOrMergeRollupPublicInputs], parityPublicInputs: ParityPublicInputs, rootRollupOutputs: BlockRootOrBlockMergePublicInputs, - l1ToL2TreeSnapshot: AppendOnlyTreeSnapshot, + updatedL1ToL2TreeSnapshot: AppendOnlyTreeSnapshot, logger?: DebugLogger, ) { const contentCommitment = new ContentCommitment( @@ -228,7 +228,7 @@ export function buildHeaderFromCircuitOutputs( parityPublicInputs.shaRoot.toBuffer(), sha256Trunc(Buffer.concat([previousMergeData[0].outHash.toBuffer(), previousMergeData[1].outHash.toBuffer()])), ); - const state = new StateReference(l1ToL2TreeSnapshot, previousMergeData[1].end); + const state = new StateReference(updatedL1ToL2TreeSnapshot, previousMergeData[1].end); 
const header = new Header( rootRollupOutputs.previousArchive, contentCommitment, @@ -239,14 +239,54 @@ export function buildHeaderFromCircuitOutputs( if (!header.hash().equals(rootRollupOutputs.endBlockHash)) { logger?.error( `Block header mismatch when building header from circuit outputs.` + - `\n\nBuilt: ${toFriendlyJSON(header)}` + + `\n\nHeader: ${inspect(header)}` + `\n\nCircuit: ${toFriendlyJSON(rootRollupOutputs)}`, ); - throw new Error(`Block header mismatch`); + throw new Error(`Block header mismatch when building from circuit outputs`); } return header; } +export async function buildHeaderFromTxEffects( + body: Body, + globalVariables: GlobalVariables, + l1ToL2Messages: Fr[], + db: MerkleTreeOperations, +) { + const stateReference = new StateReference( + await getTreeSnapshot(MerkleTreeId.L1_TO_L2_MESSAGE_TREE, db), + new PartialStateReference( + await getTreeSnapshot(MerkleTreeId.NOTE_HASH_TREE, db), + await getTreeSnapshot(MerkleTreeId.NULLIFIER_TREE, db), + await getTreeSnapshot(MerkleTreeId.PUBLIC_DATA_TREE, db), + ), + ); + + const previousArchive = await getTreeSnapshot(MerkleTreeId.ARCHIVE, db); + + const outHash = computeUnbalancedMerkleRoot( + body.txEffects.map(tx => tx.txOutHash()), + TxEffect.empty().txOutHash(), + ); + + l1ToL2Messages = padArrayEnd(l1ToL2Messages, Fr.ZERO, NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP); + const hasher = (left: Buffer, right: Buffer) => sha256Trunc(Buffer.concat([left, right])); + const parityHeight = Math.ceil(Math.log2(NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP)); + const parityShaRoot = new MerkleTreeCalculator(parityHeight, Fr.ZERO.toBuffer(), hasher).computeTreeRoot( + l1ToL2Messages.map(msg => msg.toBuffer()), + ); + + const contentCommitment = new ContentCommitment( + new Fr(body.numberOfTxsIncludingPadded), + body.getTxsEffectsHash(), + parityShaRoot, + outHash, + ); + + const fees = body.txEffects.reduce((acc, tx) => acc.add(tx.transactionFee), Fr.ZERO); + return new Header(previousArchive, contentCommitment, 
stateReference, globalVariables, fees); +} + // Validate that the roots of all local trees match the output of the root circuit simulation export async function validateBlockRootOutput( blockRootOutput: BlockRootOrBlockMergePublicInputs, @@ -282,46 +322,7 @@ export async function getRootTreeSiblingPath(treeId: T return padArrayEnd(path.toFields(), Fr.ZERO, getTreeHeight(treeId)); } -// Builds the inputs for the block root rollup circuit, without making any changes to trees -export async function getBlockRootRollupInput( - rollupOutputLeft: BaseOrMergeRollupPublicInputs, - rollupProofLeft: RecursiveProof, - verificationKeyLeft: VerificationKeyAsFields, - rollupOutputRight: BaseOrMergeRollupPublicInputs, - rollupProofRight: RecursiveProof, - verificationKeyRight: VerificationKeyAsFields, - l1ToL2Roots: RootParityInput, - newL1ToL2Messages: Tuple, - messageTreeSnapshot: AppendOnlyTreeSnapshot, - messageTreeRootSiblingPath: Tuple, - db: MerkleTreeOperations, - proverId: Fr, -) { - const previousRollupData: BlockRootRollupInputs['previousRollupData'] = [ - getPreviousRollupDataFromPublicInputs(rollupOutputLeft, rollupProofLeft, verificationKeyLeft), - getPreviousRollupDataFromPublicInputs(rollupOutputRight, rollupProofRight, verificationKeyRight), - ]; - - // Get blocks tree - const startArchiveSnapshot = await getTreeSnapshot(MerkleTreeId.ARCHIVE, db); - const newArchiveSiblingPath = await getRootTreeSiblingPath(MerkleTreeId.ARCHIVE, db); - - return BlockRootRollupInputs.from({ - previousRollupData, - l1ToL2Roots, - newL1ToL2Messages, - newL1ToL2MessageTreeRootSiblingPath: messageTreeRootSiblingPath, - startL1ToL2MessageTreeSnapshot: messageTreeSnapshot, - startArchiveSnapshot, - newArchiveSiblingPath, - // TODO(#7346): Inject previous block hash (required when integrating batch rollup circuits) - previousBlockHash: Fr.ZERO, - proverId, - }); -} - // Builds the inputs for the final root rollup circuit, without making any changes to trees -// TODO(#7346): Integrate 
batch rollup circuits and test below export function getRootRollupInput( rollupOutputLeft: BlockRootOrBlockMergePublicInputs, rollupProofLeft: RecursiveProof, diff --git a/yarn-project/prover-client/src/orchestrator/proving-state.ts b/yarn-project/prover-client/src/orchestrator/block-proving-state.ts similarity index 86% rename from yarn-project/prover-client/src/orchestrator/proving-state.ts rename to yarn-project/prover-client/src/orchestrator/block-proving-state.ts index 4aaa96abd63d..f3c2519c8311 100644 --- a/yarn-project/prover-client/src/orchestrator/proving-state.ts +++ b/yarn-project/prover-client/src/orchestrator/block-proving-state.ts @@ -1,5 +1,6 @@ import { type L2Block, type MerkleTreeId, type ProvingResult } from '@aztec/circuit-types'; import { + type ARCHIVE_HEIGHT, type AppendOnlyTreeSnapshot, type BaseOrMergeRollupPublicInputs, type BlockRootOrBlockMergePublicInputs, @@ -8,6 +9,7 @@ import { type L1_TO_L2_MSG_SUBTREE_SIBLING_PATH_LENGTH, type NESTED_RECURSIVE_PROOF_LENGTH, type NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP, + NUM_BASE_PARITY_PER_ROOT_PARITY, type Proof, type RECURSIVE_PROOF_LENGTH, type RecursiveProof, @@ -18,6 +20,12 @@ import { type Tuple } from '@aztec/foundation/serialize'; import { type TxProvingState } from './tx-proving-state.js'; +enum PROVING_STATE_LIFECYCLE { + PROVING_STATE_CREATED, + PROVING_STATE_RESOLVED, + PROVING_STATE_REJECTED, +} + export type MergeRollupInputData = { inputs: [BaseOrMergeRollupPublicInputs | undefined, BaseOrMergeRollupPublicInputs | undefined]; proofs: [ @@ -29,20 +37,11 @@ export type MergeRollupInputData = { export type TreeSnapshots = Map; -enum PROVING_STATE_LIFECYCLE { - PROVING_STATE_CREATED, - PROVING_STATE_FULL, - PROVING_STATE_RESOLVED, - PROVING_STATE_REJECTED, -} - /** - * The current state of the proving schedule. Contains the raw inputs (txs) and intermediate state to generate every constituent proof in the tree. 
- * Carries an identifier so we can identify if the proving state is discarded and a new one started. - * Captures resolve and reject callbacks to provide a promise base interface to the consumer of our proving. + * The current state of the proving schedule for a given block. Managed by the EpochProvingState. + * Contains the raw inputs and intermediate state to generate every constituent proof in the tree. */ -export class ProvingState { - private provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED; +export class BlockProvingState { private mergeRollupInputs: MergeRollupInputData[] = []; private rootParityInputs: Array<RootParityInput<typeof RECURSIVE_PROOF_LENGTH> | undefined> = []; private finalRootParityInputs: RootParityInput<typeof NESTED_RECURSIVE_PROOF_LENGTH> | undefined; @@ -50,17 +49,28 @@ export class ProvingState { public finalProof: Proof | undefined; public block: L2Block | undefined; private txs: TxProvingState[] = []; + + private provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED; + constructor( + public readonly index: number, public readonly totalNumTxs: number, - private completionCallback: (result: ProvingResult) => void, - private rejectionCallback: (reason: string) => void, public readonly globalVariables: GlobalVariables, public readonly newL1ToL2Messages: Tuple<Fr, typeof NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP>, - numRootParityInputs: number, public readonly messageTreeSnapshot: AppendOnlyTreeSnapshot, public readonly messageTreeRootSiblingPath: Tuple<Fr, typeof L1_TO_L2_MSG_SUBTREE_SIBLING_PATH_LENGTH>, + public readonly messageTreeSnapshotAfterInsertion: AppendOnlyTreeSnapshot, + public readonly archiveTreeSnapshot: AppendOnlyTreeSnapshot, + public readonly archiveTreeRootSiblingPath: Tuple<Fr, typeof ARCHIVE_HEIGHT>, + public readonly previousBlockHash: Fr, + private completionCallback?: (result: ProvingResult) => void, + private rejectionCallback?: (reason: string) => void, ) { - this.rootParityInputs = Array.from({ length: numRootParityInputs }).map(_ => undefined); + this.rootParityInputs = Array.from({ length: NUM_BASE_PARITY_PER_ROOT_PARITY }).map(_ => undefined); + } + + public get blockNumber() { + return 
this.globalVariables.blockNumber.toNumber(); } // Returns the number of levels of merge rollups @@ -95,12 +105,8 @@ export class ProvingState { } // Adds a transaction to the proving state, returns its index - // Will update the proving life cycle if this is the last transaction public addNewTx(tx: TxProvingState) { this.txs.push(tx); - if (this.txs.length === this.totalNumTxs) { - this.provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_FULL; - } return this.txs.length - 1; } @@ -124,19 +130,6 @@ export class ProvingState { return this.rootParityInputs; } - // Returns true if this proving state is still valid, false otherwise - public verifyState() { - return ( - this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED || - this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_FULL - ); - } - - // Returns true if we are still able to accept transactions, false otherwise - public isAcceptingTransactions() { - return this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED; - } - // Returns the complete set of transaction proving state objects public get allTxs() { return this.txs; } @@ -211,28 +204,37 @@ export class ProvingState { return this.rootParityInputs.findIndex(p => !p) === -1; } - // Attempts to reject the proving state promise with a reason of 'cancelled' - public cancel() { - this.reject('Proving cancelled'); + // Returns true if we are still able to accept transactions, false otherwise + public isAcceptingTransactions() { + return ( + this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED && this.totalNumTxs > this.txs.length + ); + } + + // Returns true if this proving state is still valid, false otherwise + public verifyState() { + return this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED; } // Attempts to reject the proving state promise with the given reason - // Does nothing if not in a valid state public reject(reason: string) { if 
(!this.verifyState()) { return; } this.provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_REJECTED; - this.rejectionCallback(reason); + if (this.rejectionCallback) { + this.rejectionCallback(reason); + } } // Attempts to resolve the proving state promise with the given result - // Does nothing if not in a valid state public resolve(result: ProvingResult) { if (!this.verifyState()) { return; } this.provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_RESOLVED; - this.completionCallback(result); + if (this.completionCallback) { + this.completionCallback(result); + } } } diff --git a/yarn-project/prover-client/src/orchestrator/epoch-proving-state.ts b/yarn-project/prover-client/src/orchestrator/epoch-proving-state.ts new file mode 100644 index 000000000000..350240242b8a --- /dev/null +++ b/yarn-project/prover-client/src/orchestrator/epoch-proving-state.ts @@ -0,0 +1,232 @@ +import { type MerkleTreeId, type ProvingResult } from '@aztec/circuit-types'; +import { + type ARCHIVE_HEIGHT, + type AppendOnlyTreeSnapshot, + type BlockRootOrBlockMergePublicInputs, + Fr, + type GlobalVariables, + type L1_TO_L2_MSG_SUBTREE_SIBLING_PATH_LENGTH, + type NESTED_RECURSIVE_PROOF_LENGTH, + NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP, + type Proof, + type RecursiveProof, + type RootRollupPublicInputs, + type VerificationKeyAsFields, +} from '@aztec/circuits.js'; +import { padArrayEnd } from '@aztec/foundation/collection'; +import { type Tuple } from '@aztec/foundation/serialize'; + +import { BlockProvingState } from './block-proving-state.js'; + +export type TreeSnapshots = Map<MerkleTreeId, AppendOnlyTreeSnapshot>; + +enum PROVING_STATE_LIFECYCLE { + PROVING_STATE_CREATED, + PROVING_STATE_FULL, + PROVING_STATE_RESOLVED, + PROVING_STATE_REJECTED, +} + +export type BlockMergeRollupInputData = { + inputs: [BlockRootOrBlockMergePublicInputs | undefined, BlockRootOrBlockMergePublicInputs | undefined]; + proofs: [ + RecursiveProof<typeof NESTED_RECURSIVE_PROOF_LENGTH> | undefined, + RecursiveProof<typeof NESTED_RECURSIVE_PROOF_LENGTH> | undefined, + ]; + verificationKeys: 
[VerificationKeyAsFields | undefined, VerificationKeyAsFields | undefined]; +}; + +/** + * The current state of the proving schedule for an epoch. + * Contains the raw inputs and intermediate state to generate every constituent proof in the tree. + * Carries an identifier so we can identify if the proving state is discarded and a new one started. + * Captures resolve and reject callbacks to provide a promise base interface to the consumer of our proving. + */ +export class EpochProvingState { + private provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED; + + private mergeRollupInputs: BlockMergeRollupInputData[] = []; + public rootRollupPublicInputs: RootRollupPublicInputs | undefined; + public finalProof: Proof | undefined; + public blocks: BlockProvingState[] = []; + + constructor( + public readonly epochNumber: number, + public readonly totalNumBlocks: number, + private completionCallback: (result: ProvingResult) => void, + private rejectionCallback: (reason: string) => void, + ) {} + + /** Returns the current block proving state */ + public get currentBlock(): BlockProvingState | undefined { + return this.blocks[this.blocks.length - 1]; + } + + // Returns the number of levels of merge rollups + public get numMergeLevels() { + return BigInt(Math.ceil(Math.log2(this.totalNumBlocks)) - 1); + } + + // Calculates the index and level of the parent rollup circuit + // Based on tree implementation in unbalanced_tree.ts -> batchInsert() + // REFACTOR: This is repeated from the block orchestrator + public findMergeLevel(currentLevel: bigint, currentIndex: bigint) { + const moveUpMergeLevel = (levelSize: number, index: bigint, nodeToShift: boolean) => { + levelSize /= 2; + if (levelSize & 1) { + [levelSize, nodeToShift] = nodeToShift ? [levelSize + 1, false] : [levelSize - 1, true]; + } + index >>= 1n; + return { thisLevelSize: levelSize, thisIndex: index, shiftUp: nodeToShift }; + }; + let [thisLevelSize, shiftUp] = + this.totalNumBlocks & 1 ? 
[this.totalNumBlocks - 1, true] : [this.totalNumBlocks, false]; + const maxLevel = this.numMergeLevels + 1n; + let placeholder = currentIndex; + for (let i = 0; i < maxLevel - currentLevel; i++) { + ({ thisLevelSize, thisIndex: placeholder, shiftUp } = moveUpMergeLevel(thisLevelSize, placeholder, shiftUp)); + } + let thisIndex = currentIndex; + let mergeLevel = currentLevel; + while (thisIndex >= thisLevelSize && mergeLevel != 0n) { + mergeLevel -= 1n; + ({ thisLevelSize, thisIndex, shiftUp } = moveUpMergeLevel(thisLevelSize, thisIndex, shiftUp)); + } + return [mergeLevel - 1n, thisIndex >> 1n, thisIndex & 1n]; + } + + // Adds a block to the proving state, returns its index + // Will update the proving life cycle if this is the last block + public startNewBlock( + numTxs: number, + globalVariables: GlobalVariables, + l1ToL2Messages: Fr[], + messageTreeSnapshot: AppendOnlyTreeSnapshot, + messageTreeRootSiblingPath: Tuple<Fr, typeof L1_TO_L2_MSG_SUBTREE_SIBLING_PATH_LENGTH>, + messageTreeSnapshotAfterInsertion: AppendOnlyTreeSnapshot, + archiveTreeSnapshot: AppendOnlyTreeSnapshot, + archiveTreeRootSiblingPath: Tuple<Fr, typeof ARCHIVE_HEIGHT>, + previousBlockHash: Fr, + completionCallback?: (result: ProvingResult) => void, + rejectionCallback?: (reason: string) => void, + ) { + const block = new BlockProvingState( + this.blocks.length, + numTxs, + globalVariables, + padArrayEnd(l1ToL2Messages, Fr.ZERO, NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP), + messageTreeSnapshot, + messageTreeRootSiblingPath, + messageTreeSnapshotAfterInsertion, + archiveTreeSnapshot, + archiveTreeRootSiblingPath, + previousBlockHash, + completionCallback, + reason => { + // Reject the block + if (rejectionCallback) { + rejectionCallback(reason); + } + // An error on any block rejects this whole epoch + this.reject(reason); + }, + ); + this.blocks.push(block); + if (this.blocks.length === this.totalNumBlocks) { + this.provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_FULL; + } + return this.blocks.length - 1; + } + + // Returns true if this proving state is still 
valid, false otherwise + public verifyState() { + return ( + this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED || + this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_FULL + ); + } + + // Returns true if we are still able to accept blocks, false otherwise + public isAcceptingBlocks() { + return this.provingStateLifecycle === PROVING_STATE_LIFECYCLE.PROVING_STATE_CREATED; + } + + /** + * Stores the inputs to a merge circuit and determines if the circuit is ready to be executed + * @param mergeInputs - The inputs to store + * @param indexWithinMerge - The index in the set of inputs to this merge circuit + * @param indexOfMerge - The global index of this merge circuit + * @returns True if the merge circuit is ready to be executed, false otherwise + */ + public storeMergeInputs( + mergeInputs: [ + BlockRootOrBlockMergePublicInputs, + RecursiveProof<typeof NESTED_RECURSIVE_PROOF_LENGTH>, + VerificationKeyAsFields, + ], + indexWithinMerge: number, + indexOfMerge: number, + ) { + if (!this.mergeRollupInputs[indexOfMerge]) { + const mergeInputData: BlockMergeRollupInputData = { + inputs: [undefined, undefined], + proofs: [undefined, undefined], + verificationKeys: [undefined, undefined], + }; + mergeInputData.inputs[indexWithinMerge] = mergeInputs[0]; + mergeInputData.proofs[indexWithinMerge] = mergeInputs[1]; + mergeInputData.verificationKeys[indexWithinMerge] = mergeInputs[2]; + this.mergeRollupInputs[indexOfMerge] = mergeInputData; + return false; + } + const mergeInputData = this.mergeRollupInputs[indexOfMerge]; + mergeInputData.inputs[indexWithinMerge] = mergeInputs[0]; + mergeInputData.proofs[indexWithinMerge] = mergeInputs[1]; + mergeInputData.verificationKeys[indexWithinMerge] = mergeInputs[2]; + return true; + } + + // Returns a specific block proving state + public getBlockProvingState(index: number) { + return this.blocks[index]; + } + + // Returns a set of merge rollup inputs + public getMergeInputs(indexOfMerge: number) { + return 
this.mergeRollupInputs[indexOfMerge]; + } + + // Returns true if we have sufficient inputs to execute the block root rollup + public isReadyForRootRollup() { + return !(this.mergeRollupInputs[0] === undefined || this.mergeRollupInputs[0].inputs.findIndex(p => !p) !== -1); + } + + // Attempts to reject the proving state promise with a reason of 'cancelled' + public cancel() { + this.reject('Proving cancelled'); + } + + // Attempts to reject the proving state promise with the given reason + // Does nothing if not in a valid state + public reject(reason: string) { + if (!this.verifyState()) { + return; + } + this.provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_REJECTED; + this.rejectionCallback(reason); + + for (const block of this.blocks) { + block.reject('Proving cancelled'); + } + } + + // Attempts to resolve the proving state promise with the given result + // Does nothing if not in a valid state + public resolve(result: ProvingResult) { + if (!this.verifyState()) { + return; + } + this.provingStateLifecycle = PROVING_STATE_LIFECYCLE.PROVING_STATE_RESOLVED; + this.completionCallback(result); + } +} diff --git a/yarn-project/prover-client/src/orchestrator/orchestrator.ts b/yarn-project/prover-client/src/orchestrator/orchestrator.ts index aec2271a83a9..63cf099f51ae 100644 --- a/yarn-project/prover-client/src/orchestrator/orchestrator.ts +++ b/yarn-project/prover-client/src/orchestrator/orchestrator.ts @@ -1,6 +1,5 @@ import { BlockProofError, - type BlockProver, Body, EncryptedNoteTxL2Logs, EncryptedTxL2Logs, @@ -23,12 +22,15 @@ import { mapProvingRequestTypeToCircuitName, toTxEffect, } from '@aztec/circuit-types'; +import { type EpochProver } from '@aztec/circuit-types/interfaces'; import { type CircuitName } from '@aztec/circuit-types/stats'; import { AvmCircuitInputs, type BaseOrMergeRollupPublicInputs, BaseParityInputs, type BaseRollupInputs, + type BlockRootOrBlockMergePublicInputs, + BlockRootRollupInputs, Fr, type GlobalVariables, type 
KernelCircuitPublicInputs, @@ -69,16 +71,20 @@ import { inspect } from 'util'; import { buildBaseRollupInput, buildHeaderFromCircuitOutputs, + buildHeaderFromTxEffects, + createBlockMergeRollupInputs, createMergeRollupInputs, - getBlockRootRollupInput, + getPreviousRollupDataFromPublicInputs, + getRootRollupInput, + getRootTreeSiblingPath, getSubtreeSiblingPath, getTreeSnapshot, - validateBlockRootOutput, validatePartialState, validateTx, } from './block-building-helpers.js'; +import { type BlockProvingState, type MergeRollupInputData } from './block-proving-state.js'; +import { type BlockMergeRollupInputData, EpochProvingState, type TreeSnapshots } from './epoch-proving-state.js'; import { ProvingOrchestratorMetrics } from './orchestrator_metrics.js'; -import { type MergeRollupInputData, ProvingState, type TreeSnapshots } from './proving-state.js'; import { TX_PROVING_CODE, type TxProvingInstruction, TxProvingState } from './tx-proving-state.js'; const logger = createDebugLogger('aztec:prover:proving-orchestrator'); @@ -97,8 +103,8 @@ const logger = createDebugLogger('aztec:prover:proving-orchestrator'); /** * The orchestrator, managing the flow of recursive proving operations required to build the rollup proof tree. 
*/ -export class ProvingOrchestrator implements BlockProver { - private provingState: ProvingState | undefined = undefined; +export class ProvingOrchestrator implements EpochProver { + private provingState: EpochProvingState | undefined = undefined; private pendingProvingJobs: AbortController[] = []; private paddingTx: PaddingProcessedTx | undefined = undefined; @@ -128,6 +134,23 @@ export class ProvingOrchestrator implements BlockProver { this.paddingTx = undefined; } + @trackSpan('ProvingOrchestrator.startNewEpoch', (epochNumber, totalNumBlocks) => ({ + [Attributes.EPOCH_SIZE]: totalNumBlocks, + [Attributes.EPOCH_NUMBER]: epochNumber, + })) + public startNewEpoch(epochNumber: number, totalNumBlocks: number): ProvingTicket { + const { promise: _promise, resolve, reject } = promiseWithResolvers(); + const promise = _promise.catch( + (reason): ProvingResult => ({ + status: PROVING_STATUS.FAILURE, + reason, + }), + ); + + this.provingState = new EpochProvingState(epochNumber, totalNumBlocks, resolve, reject); + return { provingPromise: promise }; + } + /** * Starts off a new block * @param numTxs - The total number of transactions in the block. 
Must be a power of 2 @@ -145,6 +168,15 @@ export class ProvingOrchestrator implements BlockProver { globalVariables: GlobalVariables, l1ToL2Messages: Fr[], ): Promise { + // If no proving state, assume we only care about proving this block and initialize a 1-block epoch + if (!this.provingState) { + this.startNewEpoch(globalVariables.blockNumber.toNumber(), 1); + } + + if (!this.provingState?.isAcceptingBlocks()) { + throw new Error(`Epoch not accepting further blocks`); + } + if (!Number.isInteger(numTxs) || numTxs < 2) { throw new Error(`Length of txs for the block should be at least two (got ${numTxs})`); } @@ -159,11 +191,10 @@ export class ProvingOrchestrator implements BlockProver { ); } - // Cancel any currently proving block before starting a new one - this.cancelBlock(); logger.info( `Starting block ${globalVariables.blockNumber} for slot ${globalVariables.slotNumber} with ${numTxs} transactions`, ); + // we start the block by enqueueing all of the base parity circuits let baseParityInputs: BaseParityInputs[] = []; let l1ToL2MessagesPadded: Tuple; @@ -193,36 +224,39 @@ export class ProvingOrchestrator implements BlockProver { // Update the local trees to include the new l1 to l2 messages await this.db.appendLeaves(MerkleTreeId.L1_TO_L2_MESSAGE_TREE, l1ToL2MessagesPadded); + const messageTreeSnapshotAfterInsertion = await getTreeSnapshot(MerkleTreeId.L1_TO_L2_MESSAGE_TREE, this.db); + + // Get archive snapshot before this block lands + const startArchiveSnapshot = await getTreeSnapshot(MerkleTreeId.ARCHIVE, this.db); + const newArchiveSiblingPath = await getRootTreeSiblingPath(MerkleTreeId.ARCHIVE, this.db); + const previousBlockHash = await this.db.getLeafValue( + MerkleTreeId.ARCHIVE, + BigInt(startArchiveSnapshot.nextAvailableLeafIndex - 1), + ); const { promise: _promise, resolve, reject } = promiseWithResolvers(); - const promise = _promise.catch( - (reason): ProvingResult => ({ - status: PROVING_STATUS.FAILURE, - reason, - }), - ); + const promise = 
_promise.catch((reason): ProvingResult => ({ status: PROVING_STATUS.FAILURE, reason })); - const provingState = new ProvingState( + this.provingState!.startNewBlock( numTxs, - resolve, - reject, globalVariables, l1ToL2MessagesPadded, - baseParityInputs.length, messageTreeSnapshot, newL1ToL2MessageTreeRootSiblingPath, + messageTreeSnapshotAfterInsertion, + startArchiveSnapshot, + newArchiveSiblingPath, + previousBlockHash!, + resolve, + reject, ); + // Enqueue base parity circuits for the block for (let i = 0; i < baseParityInputs.length; i++) { - this.enqueueBaseParityCircuit(provingState, baseParityInputs[i], i); + this.enqueueBaseParityCircuit(this.provingState!.currentBlock!, baseParityInputs[i], i); } - this.provingState = provingState; - - const ticket: ProvingTicket = { - provingPromise: promise, - }; - return ticket; + return { provingPromise: promise }; } /** @@ -233,11 +267,12 @@ export class ProvingOrchestrator implements BlockProver { [Attributes.TX_HASH]: tx.hash.toString(), })) public async addNewTx(tx: ProcessedTx): Promise { - if (!this.provingState) { + const provingState = this?.provingState?.currentBlock; + if (!provingState) { throw new Error(`Invalid proving state, call startNewBlock before adding transactions`); } - if (!this.provingState.isAcceptingTransactions()) { + if (!provingState.isAcceptingTransactions()) { throw new Error(`Rollup not accepting further transactions`); } @@ -250,34 +285,43 @@ export class ProvingOrchestrator implements BlockProver { return; } - const [inputs, treeSnapshots] = await this.prepareTransaction(tx, this.provingState); - this.enqueueFirstProofs(inputs, treeSnapshots, tx, this.provingState); + const [inputs, treeSnapshots] = await this.prepareTransaction(tx, provingState); + this.enqueueFirstProofs(inputs, treeSnapshots, tx, provingState); + + if (provingState.transactionsReceived === provingState.totalNumTxs) { + logger.verbose( + `All transactions received for block ${provingState.globalVariables.blockNumber}. 
Assembling header.`, + ); + await this.buildBlockHeader(provingState); + } } /** * Marks the block as full and pads it if required, no more transactions will be accepted. + * Computes the block header and updates the archive tree. */ @trackSpan('ProvingOrchestrator.setBlockCompleted', function () { - if (!this.provingState) { + const block = this.provingState?.currentBlock; + if (!block) { return {}; } - return { - [Attributes.BLOCK_NUMBER]: this.provingState!.globalVariables.blockNumber.toNumber(), - [Attributes.BLOCK_SIZE]: this.provingState!.totalNumTxs, - [Attributes.BLOCK_TXS_COUNT]: this.provingState!.transactionsReceived, + [Attributes.BLOCK_NUMBER]: block.globalVariables.blockNumber.toNumber(), + [Attributes.BLOCK_SIZE]: block.totalNumTxs, + [Attributes.BLOCK_TXS_COUNT]: block.transactionsReceived, }; }) public async setBlockCompleted() { - if (!this.provingState) { + const provingState = this.provingState?.currentBlock; + if (!provingState) { throw new Error(`Invalid proving state, call startNewBlock before adding transactions or completing the block`); } // we may need to pad the rollup with empty transactions - const paddingTxCount = this.provingState.totalNumTxs - this.provingState.transactionsReceived; + const paddingTxCount = provingState.totalNumTxs - provingState.transactionsReceived; if (paddingTxCount === 0) { return; - } else if (this.provingState.totalNumTxs > 2) { + } else if (provingState.totalNumTxs > 2) { throw new Error(`Block not ready for completion: expecting ${paddingTxCount} more transactions.`); } @@ -291,13 +335,13 @@ export class ProvingOrchestrator implements BlockProver { // Then enqueue the proving of all the transactions const unprovenPaddingTx = makeEmptyProcessedTx( this.db.getInitialHeader(), - this.provingState.globalVariables.chainId, - this.provingState.globalVariables.version, + provingState.globalVariables.chainId, + provingState.globalVariables.version, getVKTreeRoot(), ); const txInputs: Array<{ inputs: 
BaseRollupInputs; snapshot: TreeSnapshots }> = []; for (let i = 0; i < paddingTxCount; i++) { - const [inputs, snapshot] = await this.prepareTransaction(unprovenPaddingTx, this.provingState); + const [inputs, snapshot] = await this.prepareTransaction(unprovenPaddingTx, provingState); const txInput = { inputs, snapshot, @@ -306,13 +350,53 @@ export class ProvingOrchestrator implements BlockProver { } // Now enqueue the proving - this.enqueuePaddingTxs(this.provingState, txInputs, unprovenPaddingTx); + this.enqueuePaddingTxs(provingState, txInputs, unprovenPaddingTx); + + // And build the block header + logger.verbose(`Block ${provingState.globalVariables.blockNumber} padded with empty tx(s). Assembling header.`); + await this.buildBlockHeader(provingState); + } + + private async buildBlockHeader(provingState: BlockProvingState) { + // Collect all new nullifiers, commitments, and contracts from all txs in this block to build body + const gasFees = provingState.globalVariables.gasFees; + const nonEmptyTxEffects: TxEffect[] = provingState!.allTxs + .map(txProvingState => toTxEffect(txProvingState.processedTx, gasFees)) + .filter(txEffect => !txEffect.isEmpty()); + const body = new Body(nonEmptyTxEffects); + + // Given we've applied every change from this block, now assemble the block header + // and update the archive tree, so we're ready to start processing the next block + const header = await buildHeaderFromTxEffects( + body, + provingState.globalVariables, + provingState.newL1ToL2Messages, + this.db, + ); + + logger.verbose(`Updating archive tree with block ${provingState.blockNumber} header ${header.hash().toString()}`); + await this.db.updateArchive(header); + + // Assemble the L2 block + const newArchive = await getTreeSnapshot(MerkleTreeId.ARCHIVE, this.db); + const l2Block = L2Block.fromFields({ archive: newArchive, header, body }); + + if (!l2Block.body.getTxsEffectsHash().equals(header.contentCommitment.txsEffectsHash)) { + throw new Error( + `Txs effects 
hash mismatch, ${l2Block.body + .getTxsEffectsHash() + .toString('hex')} == ${header.contentCommitment.txsEffectsHash.toString('hex')} `, + ); + } + + logger.verbose(`Orchestrator finalised block ${l2Block.number}`); + provingState.block = l2Block; } // Enqueues the proving of the required padding transactions // If the fully proven padding transaction is not available, this will first be proven private enqueuePaddingTxs( - provingState: ProvingState, + provingState: BlockProvingState, txInputs: Array<{ inputs: BaseRollupInputs; snapshot: TreeSnapshots }>, unprovenPaddingTx: ProcessedTx, ) { @@ -330,7 +414,7 @@ export class ProvingOrchestrator implements BlockProver { 'ProvingOrchestrator.prover.getEmptyPrivateKernelProof', { [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server', - [Attributes.PROTOCOL_CIRCUIT_NAME]: 'private-kernel-empty' as CircuitName, + [Attributes.PROTOCOL_CIRCUIT_NAME]: 'private-kernel-empty' satisfies CircuitName, }, signal => this.prover.getEmptyPrivateKernelProof( @@ -364,7 +448,7 @@ export class ProvingOrchestrator implements BlockProver { private provePaddingTransactions( txInputs: Array<{ inputs: BaseRollupInputs; snapshot: TreeSnapshots }>, paddingTx: PaddingProcessedTx, - provingState: ProvingState, + provingState: BlockProvingState, ) { // The padding tx contains the proof and vk, generated separately from the base inputs // Copy these into the base rollup inputs and enqueue the base rollup proof @@ -381,9 +465,9 @@ export class ProvingOrchestrator implements BlockProver { } /** - * Cancel any further proving of the block + * Cancel any further proving */ - public cancelBlock() { + public cancel() { for (const controller of this.pendingProvingJobs) { controller.abort(); } @@ -393,118 +477,83 @@ export class ProvingOrchestrator implements BlockProver { /** * Extract the block header from public inputs. - * TODO(#7346): Refactor this once new batch rollup circuits are integrated * @returns The header of this proving state's block. 
*/ - private async extractBlockHeader() { - if ( - !this.provingState || - !this.provingState.blockRootRollupPublicInputs || - !this.provingState.finalRootParityInput?.publicInputs.shaRoot - ) { - throw new Error(`Invalid proving state, a block must be proven before its header can be extracted.`); - } - - const rootRollupOutputs = this.provingState.blockRootRollupPublicInputs; - const previousMergeData = this.provingState.getMergeInputs(0).inputs; + private extractBlockHeaderFromPublicInputs( + provingState: BlockProvingState, + rootRollupOutputs: BlockRootOrBlockMergePublicInputs, + ) { + const previousMergeData = provingState.getMergeInputs(0).inputs; if (!previousMergeData[0] || !previousMergeData[1]) { throw new Error(`Invalid proving state, final merge inputs before block root circuit missing.`); } - const l1ToL2TreeSnapshot = await getTreeSnapshot(MerkleTreeId.L1_TO_L2_MESSAGE_TREE, this.db); - return buildHeaderFromCircuitOutputs( [previousMergeData[0], previousMergeData[1]], - this.provingState.finalRootParityInput.publicInputs, + provingState.finalRootParityInput!.publicInputs, rootRollupOutputs, - l1ToL2TreeSnapshot, + provingState.messageTreeSnapshotAfterInsertion, + logger, ); } /** - * Performs the final tree update for the block and returns the fully proven block. + * Returns the fully proven block. Requires proving to have been completed. + * @param index - The index of the block to finalise. Defaults to the last block. * @returns The fully proven block and proof. 
*/ - @trackSpan('ProvingOrchestrator.finaliseBlock', function () { - return { - [Attributes.BLOCK_NUMBER]: this.provingState!.globalVariables.blockNumber.toNumber(), - [Attributes.BLOCK_TXS_COUNT]: this.provingState!.transactionsReceived, - [Attributes.BLOCK_SIZE]: this.provingState!.totalNumTxs, - }; - }) - public async finaliseBlock() { + public finaliseBlock(index?: number) { try { - if (!this.provingState || !this.provingState.blockRootRollupPublicInputs || !this.provingState.finalProof) { - throw new Error(`Invalid proving state, a block must be proven before it can be finalised`); - } - if (this.provingState.block) { - throw new Error('Block already finalised'); - } - - const rootRollupOutputs = this.provingState.blockRootRollupPublicInputs; - const header = await this.extractBlockHeader(); - - logger?.debug(`Updating and validating root trees`); - await this.db.updateArchive(header); + const block = this.provingState?.blocks[index ?? this.provingState?.blocks.length - 1]; - await validateBlockRootOutput(rootRollupOutputs, header, this.db); - - // Collect all new nullifiers, commitments, and contracts from all txs in this block - const gasFees = this.provingState.globalVariables.gasFees; - const nonEmptyTxEffects: TxEffect[] = this.provingState!.allTxs.map(txProvingState => - toTxEffect(txProvingState.processedTx, gasFees), - ).filter(txEffect => !txEffect.isEmpty()); - const blockBody = new Body(nonEmptyTxEffects); - - const l2Block = L2Block.fromFields({ - archive: rootRollupOutputs.newArchive, - header: header, - body: blockBody, - }); - - if (!l2Block.body.getTxsEffectsHash().equals(header.contentCommitment.txsEffectsHash)) { - logger.debug(inspect(blockBody)); - throw new Error( - `Txs effects hash mismatch, ${l2Block.body - .getTxsEffectsHash() - .toString('hex')} == ${header.contentCommitment.txsEffectsHash.toString('hex')} `, - ); + if (!block || !block.blockRootRollupPublicInputs || !block.finalProof || !block.block) { + throw new Error(`Invalid 
proving state, a block must be proven before it can be finalised`); } - logger.info(`Orchestrator finalised block ${l2Block.number}`); - - this.provingState.block = l2Block; - const blockResult: ProvingBlockResult = { - proof: this.provingState.finalProof, - aggregationObject: this.provingState.finalProof.extractAggregationObject(), - block: l2Block, + proof: block.finalProof, + aggregationObject: block.finalProof.extractAggregationObject(), + block: block.block!, }; pushTestData('blockResults', { proverId: this.proverId.toString(), vkTreeRoot: getVKTreeRoot().toString(), - block: l2Block.toString(), - proof: this.provingState.finalProof.toString(), + block: blockResult.block.toString(), + proof: blockResult.proof.toString(), aggregationObject: blockResult.aggregationObject.map(x => x.toString()), }); - return blockResult; + return Promise.resolve(blockResult); } catch (err) { throw new BlockProofError( err && typeof err === 'object' && 'message' in err ? String(err.message) : String(err), - this.provingState?.allTxs.map(x => Tx.getHash(x.processedTx)) ?? [], + this.provingState?.blocks[index ?? this.provingState?.blocks.length - 1]?.allTxs.map(x => + Tx.getHash(x.processedTx), + ) ?? [], ); } } + /** + * Returns the proof for the current epoch. + * Requires proving to have been completed. 
+ */ + public finaliseEpoch() { + if (!this.provingState || !this.provingState.rootRollupPublicInputs || !this.provingState.finalProof) { + throw new Error(`Invalid proving state, an epoch must be proven before it can be finalised`); + } + + return { proof: this.provingState.finalProof, publicInputs: this.provingState.rootRollupPublicInputs }; + } + /** * Starts the proving process for the given transaction and adds it to our state * @param tx - The transaction whose proving we wish to commence * @param provingState - The proving state being worked on */ - private async prepareTransaction(tx: ProcessedTx, provingState: ProvingState) { + private async prepareTransaction(tx: ProcessedTx, provingState: BlockProvingState) { const txInputs = await this.prepareBaseRollupInputs(provingState, tx); if (!txInputs) { // This should not be possible @@ -517,7 +566,7 @@ export class ProvingOrchestrator implements BlockProver { inputs: BaseRollupInputs, treeSnapshots: TreeSnapshots, tx: ProcessedTx, - provingState: ProvingState, + provingState: BlockProvingState, ) { const txProvingState = new TxProvingState(tx, inputs, treeSnapshots); const txIndex = provingState.addNewTx(txProvingState); @@ -538,7 +587,7 @@ export class ProvingOrchestrator implements BlockProver { * @param job - The actual job, returns a promise notifying of the job's completion */ private deferredProving( - provingState: ProvingState | undefined, + provingState: EpochProvingState | BlockProvingState | undefined, request: (signal: AbortSignal) => Promise, callback: (result: T) => void | Promise, ) { @@ -598,7 +647,7 @@ export class ProvingOrchestrator implements BlockProver { [Attributes.TX_HASH]: tx.hash.toString(), })) private async prepareBaseRollupInputs( - provingState: ProvingState | undefined, + provingState: BlockProvingState | undefined, tx: ProcessedTx, ): Promise<[BaseRollupInputs, TreeSnapshots] | undefined> { if (!provingState?.verifyState()) { @@ -636,34 +685,9 @@ export class ProvingOrchestrator 
implements BlockProver {
     return [inputs, treeSnapshots];
   }
 
-  // Stores the intermediate inputs prepared for a merge proof
-  private storeMergeInputs(
-    provingState: ProvingState,
-    currentLevel: bigint,
-    currentIndex: bigint,
-    mergeInputs: [
-      BaseOrMergeRollupPublicInputs,
-      RecursiveProof,
-      VerificationKeyAsFields,
-    ],
-  ) {
-    const [mergeLevel, indexWithinMergeLevel, indexWithinMerge] = provingState.findMergeLevel(
-      currentLevel,
-      currentIndex,
-    );
-    const mergeIndex = 2n ** mergeLevel - 1n + indexWithinMergeLevel;
-    const ready = provingState.storeMergeInputs(mergeInputs, Number(indexWithinMerge), Number(mergeIndex));
-    return {
-      ready,
-      indexWithinMergeLevel,
-      mergeLevel,
-      mergeInputData: provingState.getMergeInputs(Number(mergeIndex)),
-    };
-  }
-
   // Executes the base rollup circuit and stores the output as intermediate state for the parent merge/root circuit
   // Executes the next level of merge if all inputs are available
-  private enqueueBaseRollup(provingState: ProvingState | undefined, index: bigint, tx: TxProvingState) {
+  private enqueueBaseRollup(provingState: BlockProvingState | undefined, index: bigint, tx: TxProvingState) {
     if (!provingState?.verifyState()) {
       logger.debug('Not running base rollup, state invalid');
       return;
@@ -724,7 +748,7 @@ export class ProvingOrchestrator implements BlockProver {
       {
         [Attributes.TX_HASH]: tx.processedTx.hash.toString(),
         [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
-        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'base-rollup' as CircuitName,
+        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'base-rollup' satisfies CircuitName,
       },
       signal => this.prover.getBaseRollupProof(tx.baseRollupInputs, signal, provingState.epochNumber),
     ),
@@ -743,7 +767,7 @@ export class ProvingOrchestrator implements BlockProver {
 
   // Enqueues the tube circuit for a given transaction index
   // Once completed, will enqueue the next circuit, either a public kernel or the base rollup
-  private enqueueTube(provingState: ProvingState, txIndex: number) {
+
private enqueueTube(provingState: BlockProvingState, txIndex: number) {
     if (!provingState?.verifyState()) {
       logger.debug('Not running tube circuit, state invalid');
       return;
@@ -760,7 +784,7 @@ export class ProvingOrchestrator implements BlockProver {
       {
         [Attributes.TX_HASH]: txProvingState.processedTx.hash.toString(),
         [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
-        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'tube-circuit' as CircuitName,
+        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'tube-circuit' satisfies CircuitName,
       },
       signal =>
         this.prover.getTubeProof(
@@ -780,7 +804,7 @@ export class ProvingOrchestrator implements BlockProver {
 
   // Executes the merge rollup circuit and stores the output as intermediate state for the parent merge/block root circuit
   // Enqueues the next level of merge if all inputs are available
   private enqueueMergeRollup(
-    provingState: ProvingState,
+    provingState: BlockProvingState,
     level: bigint,
     index: bigint,
     mergeInputData: MergeRollupInputData,
@@ -797,7 +821,7 @@ export class ProvingOrchestrator implements BlockProver {
       'ProvingOrchestrator.prover.getMergeRollupProof',
       {
         [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
-        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'merge-rollup' as CircuitName,
+        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'merge-rollup' satisfies CircuitName,
       },
       signal => this.prover.getMergeRollupProof(inputs, signal, provingState.epochNumber),
     ),
@@ -812,29 +836,40 @@ export class ProvingOrchestrator implements BlockProver {
   }
 
   // Executes the block root rollup circuit
-  private async enqueueBlockRootRollup(provingState: ProvingState | undefined) {
+  private enqueueBlockRootRollup(provingState: BlockProvingState | undefined) {
     if (!provingState?.verifyState()) {
-      logger.debug('Not running root rollup, state no longer valid');
+      logger.debug('Not running block root rollup, state no longer valid');
       return;
     }
 
     const mergeInputData = provingState.getMergeInputs(0);
     const rootParityInput = provingState.finalRootParityInput!;
 
-    const inputs = await
getBlockRootRollupInput( - mergeInputData.inputs[0]!, - mergeInputData.proofs[0]!, - mergeInputData.verificationKeys[0]!, - mergeInputData.inputs[1]!, - mergeInputData.proofs[1]!, - mergeInputData.verificationKeys[1]!, - rootParityInput, - provingState.newL1ToL2Messages, - provingState.messageTreeSnapshot, - provingState.messageTreeRootSiblingPath, - this.db, - this.proverId, + logger.debug( + `Enqueuing block root rollup for block ${provingState.blockNumber} with ${provingState.newL1ToL2Messages.length} l1 to l2 msgs`, ); + const previousRollupData: BlockRootRollupInputs['previousRollupData'] = makeTuple(2, i => + getPreviousRollupDataFromPublicInputs( + mergeInputData.inputs[i]!, + mergeInputData.proofs[i]!, + mergeInputData.verificationKeys[i]!, + ), + ); + + const inputs = BlockRootRollupInputs.from({ + previousRollupData, + l1ToL2Roots: rootParityInput, + newL1ToL2Messages: provingState.newL1ToL2Messages, + newL1ToL2MessageTreeRootSiblingPath: provingState.messageTreeRootSiblingPath, + startL1ToL2MessageTreeSnapshot: provingState.messageTreeSnapshot, + startArchiveSnapshot: provingState.archiveTreeSnapshot, + newArchiveSiblingPath: provingState.archiveTreeRootSiblingPath, + previousBlockHash: provingState.previousBlockHash, + proverId: this.proverId, + }); + + const shouldProveEpoch = this.provingState!.totalNumBlocks > 1; + this.deferredProving( provingState, wrapCallbackInSpan( @@ -842,25 +877,48 @@ export class ProvingOrchestrator implements BlockProver { 'ProvingOrchestrator.prover.getBlockRootRollupProof', { [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server', - [Attributes.PROTOCOL_CIRCUIT_NAME]: 'block-root-rollup' as CircuitName, + [Attributes.PROTOCOL_CIRCUIT_NAME]: 'block-root-rollup' satisfies CircuitName, }, - signal => this.prover.getBlockRootRollupProof(inputs, signal, provingState.epochNumber), + signal => + shouldProveEpoch + ? 
this.prover.getBlockRootRollupProof(inputs, signal, provingState.epochNumber) + : this.prover.getBlockRootRollupFinalProof(inputs, signal, provingState.epochNumber), ), result => { + const header = this.extractBlockHeaderFromPublicInputs(provingState, result.inputs); + if (!header.hash().equals(provingState.block!.header.hash())) { + logger.error( + `Block header mismatch\nCircuit:${inspect(header)}\nComputed:${inspect(provingState.block!.header)}`, + ); + provingState.reject(`Block header hash mismatch`); + } + provingState.blockRootRollupPublicInputs = result.inputs; provingState.finalProof = result.proof.binaryProof; + provingState.resolve({ status: PROVING_STATUS.SUCCESS }); + + logger.debug(`Completed proof for block root rollup for ${provingState.block?.number}`); + // validatePartialState(result.inputs.end, tx.treeSnapshots); // TODO(palla/prover) + + // TODO(palla/prover): Remove this once we've dropped the flow for proving single blocks + if (!shouldProveEpoch) { + logger.verbose(`Skipping epoch rollup, only one block in epoch`); + return; + } - const provingResult: ProvingResult = { - status: PROVING_STATUS.SUCCESS, - }; - provingState.resolve(provingResult); + const currentLevel = this.provingState!.numMergeLevels + 1n; + this.storeAndExecuteNextBlockMergeLevel(this.provingState!, currentLevel, BigInt(provingState.index), [ + result.inputs, + result.proof, + result.verificationKey.keyAsFields, + ]); }, ); } // Executes the base parity circuit and stores the intermediate state for the root parity circuit // Enqueues the root parity circuit if all inputs are available - private enqueueBaseParityCircuit(provingState: ProvingState, inputs: BaseParityInputs, index: number) { + private enqueueBaseParityCircuit(provingState: BlockProvingState, inputs: BaseParityInputs, index: number) { this.deferredProving( provingState, wrapCallbackInSpan( @@ -868,7 +926,7 @@ export class ProvingOrchestrator implements BlockProver { 
'ProvingOrchestrator.prover.getBaseParityProof',
       {
         [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
-        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'base-parity' as CircuitName,
+        [Attributes.PROTOCOL_CIRCUIT_NAME]: 'base-parity' satisfies CircuitName,
       },
       signal => this.prover.getBaseParityProof(inputs, signal, provingState.epochNumber),
     ),
@@ -889,7 +947,7 @@ export class ProvingOrchestrator implements BlockProver {
 
   // Runs the root parity circuit and stores the outputs
   // Enqueues the root rollup proof if all inputs are available
-  private enqueueRootParityCircuit(provingState: ProvingState, inputs: RootParityInputs) {
+  private enqueueRootParityCircuit(provingState: BlockProvingState, inputs: RootParityInputs) {
     this.deferredProving(
       provingState,
       wrapCallbackInSpan(
@@ -897,23 +955,104 @@ export class ProvingOrchestrator implements BlockProver {
         'ProvingOrchestrator.prover.getRootParityProof',
         {
           [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
-          [Attributes.PROTOCOL_CIRCUIT_NAME]: 'root-parity' as CircuitName,
+          [Attributes.PROTOCOL_CIRCUIT_NAME]: 'root-parity' satisfies CircuitName,
         },
         signal => this.prover.getRootParityProof(inputs, signal, provingState.epochNumber),
       ),
-      async rootInput => {
+      rootInput => {
         provingState!.finalRootParityInput = rootInput;
-        await this.checkAndEnqueueBlockRootRollup(provingState);
+        this.checkAndEnqueueBlockRootRollup(provingState);
+      },
+    );
+  }
+
+  // Executes the block merge rollup circuit and stores the output as intermediate state for the parent merge/block root circuit
+  // Enqueues the next level of merge if all inputs are available
+  private enqueueBlockMergeRollup(
+    provingState: EpochProvingState,
+    level: bigint,
+    index: bigint,
+    mergeInputData: BlockMergeRollupInputData,
+  ) {
+    const inputs = createBlockMergeRollupInputs(
+      [mergeInputData.inputs[0]!, mergeInputData.proofs[0]!, mergeInputData.verificationKeys[0]!],
+      [mergeInputData.inputs[1]!, mergeInputData.proofs[1]!, mergeInputData.verificationKeys[1]!],
+    );
+
       this.deferredProving(
+        provingState,
+        wrapCallbackInSpan(
+          this.tracer,
+          'ProvingOrchestrator.prover.getBlockMergeRollupProof',
+          {
+            [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
+            [Attributes.PROTOCOL_CIRCUIT_NAME]: 'block-merge-rollup' satisfies CircuitName,
+          },
+          signal => this.prover.getBlockMergeRollupProof(inputs, signal, provingState.epochNumber),
+        ),
+        result => {
+          this.storeAndExecuteNextBlockMergeLevel(provingState, level, index, [
+            result.inputs,
+            result.proof,
+            result.verificationKey.keyAsFields,
+          ]);
       },
     );
   }
 
-  private async checkAndEnqueueBlockRootRollup(provingState: ProvingState | undefined) {
+  // Executes the root rollup circuit
+  private enqueueRootRollup(provingState: EpochProvingState | undefined) {
+    if (!provingState?.verifyState()) {
+      logger.debug('Not running root rollup, state no longer valid');
+      return;
+    }
+
+    logger.debug(`Preparing root rollup`);
+    const mergeInputData = provingState.getMergeInputs(0);
+
+    const inputs = getRootRollupInput(
+      mergeInputData.inputs[0]!,
+      mergeInputData.proofs[0]!,
+      mergeInputData.verificationKeys[0]!,
+      mergeInputData.inputs[1]!,
+      mergeInputData.proofs[1]!,
+      mergeInputData.verificationKeys[1]!,
+      this.proverId,
+    );
+
+    this.deferredProving(
+      provingState,
+      wrapCallbackInSpan(
+        this.tracer,
+        'ProvingOrchestrator.prover.getRootRollupProof',
+        {
+          [Attributes.PROTOCOL_CIRCUIT_TYPE]: 'server',
+          [Attributes.PROTOCOL_CIRCUIT_NAME]: 'root-rollup' satisfies CircuitName,
+        },
+        signal => this.prover.getRootRollupProof(inputs, signal, provingState.epochNumber),
+      ),
+      result => {
+        provingState.rootRollupPublicInputs = result.inputs;
+        provingState.finalProof = result.proof.binaryProof;
+        provingState.resolve({ status: PROVING_STATUS.SUCCESS });
+      },
+    );
+  }
+
+  private checkAndEnqueueBlockRootRollup(provingState: BlockProvingState | undefined) {
     if (!provingState?.isReadyForBlockRootRollup()) {
       logger.debug('Not ready for root rollup');
       return;
     }
-    await this.enqueueBlockRootRollup(provingState);
+    this.enqueueBlockRootRollup(provingState);
+  }
+
+  private checkAndEnqueueRootRollup(provingState: EpochProvingState | undefined) {
+    if (!provingState?.isReadyForRootRollup()) {
+      logger.debug('Not ready for root rollup');
+      return;
+    }
+    this.enqueueRootRollup(provingState);
   }
 
   /**
@@ -924,7 +1063,7 @@ export class ProvingOrchestrator implements BlockProver {
    * @param mergeInputData - The inputs to be stored
    */
   private storeAndExecuteNextMergeLevel(
-    provingState: ProvingState,
+    provingState: BlockProvingState,
     currentLevel: bigint,
     currentIndex: bigint,
     mergeInputData: [
@@ -933,19 +1072,68 @@ export class ProvingOrchestrator implements BlockProver {
       VerificationKeyAsFields,
     ],
   ) {
-    const result = this.storeMergeInputs(provingState, currentLevel, currentIndex, mergeInputData);
+    const [mergeLevel, indexWithinMergeLevel, indexWithinMerge] = provingState.findMergeLevel(
+      currentLevel,
+      currentIndex,
+    );
+    const mergeIndex = 2n ** mergeLevel - 1n + indexWithinMergeLevel;
+    const ready = provingState.storeMergeInputs(mergeInputData, Number(indexWithinMerge), Number(mergeIndex));
+    const nextMergeInputData = provingState.getMergeInputs(Number(mergeIndex));
+
+    // Are we ready to execute the next circuit?
+    if (!ready) {
+      return;
+    }
+
+    if (mergeLevel === 0n) {
+      this.checkAndEnqueueBlockRootRollup(provingState);
+    } else {
+      // onto the next merge level
+      this.enqueueMergeRollup(provingState, mergeLevel, indexWithinMergeLevel, nextMergeInputData);
+    }
+  }
+
+  /**
+   * Stores the inputs to a block merge/root circuit and enqueues the circuit if ready
+   * @param provingState - The proving state being operated on
+   * @param currentLevel - The level of the merge/root circuit
+   * @param currentIndex - The index of the merge/root circuit
+   * @param mergeInputData - The inputs to be stored
+   */
+  private storeAndExecuteNextBlockMergeLevel(
+    provingState: EpochProvingState,
+    currentLevel: bigint,
+    currentIndex: bigint,
+    mergeInputData: [
+      BlockRootOrBlockMergePublicInputs,
+      RecursiveProof<typeof NESTED_RECURSIVE_PROOF_LENGTH>,
+      VerificationKeyAsFields,
+    ],
+  ) {
+    const [mergeLevel, indexWithinMergeLevel, indexWithinMerge] = provingState.findMergeLevel(
+      currentLevel,
+      currentIndex,
+    );
+    logger.debug(`Computed merge for ${currentLevel}.${currentIndex} as ${mergeLevel}.${indexWithinMergeLevel}`);
+    if (mergeLevel < 0n) {
+      throw new Error(`Invalid merge level ${mergeLevel}`);
+    }
+
+    const mergeIndex = 2n ** mergeLevel - 1n + indexWithinMergeLevel;
+    const ready = provingState.storeMergeInputs(mergeInputData, Number(indexWithinMerge), Number(mergeIndex));
+    const nextMergeInputData = provingState.getMergeInputs(Number(mergeIndex));
 
     // Are we ready to execute the next circuit?
-    if (!result.ready) {
+    if (!ready) {
+      logger.debug(`Not ready to execute next block merge for level ${mergeLevel} index ${indexWithinMergeLevel}`);
       return;
     }
 
-    if (result.mergeLevel === 0n) {
-      // TODO (alexg) remove this `void`
-      void this.checkAndEnqueueBlockRootRollup(provingState);
+    if (mergeLevel === 0n) {
+      this.checkAndEnqueueRootRollup(provingState);
     } else {
       // onto the next merge level
-      this.enqueueMergeRollup(provingState, result.mergeLevel, result.indexWithinMergeLevel, result.mergeInputData);
+      this.enqueueBlockMergeRollup(provingState, mergeLevel, indexWithinMergeLevel, nextMergeInputData);
     }
   }
 
@@ -956,7 +1144,7 @@ export class ProvingOrchestrator implements BlockProver {
    * @param txIndex - The index of the transaction being proven
    * @param functionIndex - The index of the function/kernel being proven
    */
-  private enqueueVM(provingState: ProvingState | undefined, txIndex: number, functionIndex: number) {
+  private enqueueVM(provingState: BlockProvingState | undefined, txIndex: number, functionIndex: number) {
     if (!provingState?.verifyState()) {
       logger.debug(`Not running VM circuit as state is no longer valid`);
       return;
@@ -1006,7 +1194,7 @@ export class ProvingOrchestrator implements BlockProver {
   }
 
   private checkAndEnqueuePublicKernelFromVMProof(
-    provingState: ProvingState,
+    provingState: BlockProvingState,
     txIndex: number,
     functionIndex: number,
     vmProof: Proof,
@@ -1027,7 +1215,7 @@ export class ProvingOrchestrator implements BlockProver {
   // This could be either a public kernel or the base rollup
   // Alternatively, if we are still waiting on a public VM prof then it will continue waiting
   private checkAndEnqueueNextTxCircuit(
-    provingState: ProvingState,
+    provingState: BlockProvingState,
     txIndex: number,
     proof: RecursiveProof<typeof RECURSIVE_PROOF_LENGTH> | RecursiveProof<typeof NESTED_RECURSIVE_PROOF_LENGTH>,
     verificationKey: VerificationKeyData,
@@ -1070,7 +1258,7 @@ export class ProvingOrchestrator implements BlockProver {
    * @param txIndex - The index of the transaction being proven
    * @param functionIndex - The index of the function/kernel being proven
    */
-  private enqueuePublicKernel(provingState: ProvingState | undefined, txIndex: number, functionIndex: number) {
+  private enqueuePublicKernel(provingState: BlockProvingState | undefined, txIndex: number, functionIndex: number) {
     if (!provingState?.verifyState()) {
       logger.debug(`Not running public kernel circuit as state is no longer valid`);
       return;
diff --git a/yarn-project/prover-client/src/orchestrator/orchestrator_errors.test.ts b/yarn-project/prover-client/src/orchestrator/orchestrator_errors.test.ts
index 768304c77a0f..01bdd4006436 100644
--- a/yarn-project/prover-client/src/orchestrator/orchestrator_errors.test.ts
+++ b/yarn-project/prover-client/src/orchestrator/orchestrator_errors.test.ts
@@ -46,6 +46,16 @@ describe('prover/orchestrator/errors', () => {
     expect(finalisedBlock.block.number).toEqual(context.blockNumber);
   });
 
+  it('throws if adding too many blocks', async () => {
+    context.orchestrator.startNewEpoch(1, 1);
+    await context.orchestrator.startNewBlock(2, context.globalVariables, []);
+    await context.orchestrator.setBlockCompleted();
+
+    await expect(
+      async () => await context.orchestrator.startNewBlock(2, context.globalVariables, []),
+    ).rejects.toThrow('Epoch not accepting further blocks');
+  });
+
   it('throws if adding a transaction before start', async () => {
     await expect(
       async () => await context.orchestrator.addNewTx(makeEmptyProcessedTestTx(context.actualDb)),
@@ -71,27 +81,10 @@ describe('prover/orchestrator/errors', () => {
     );
   });
 
-  it('throws if finalising an already finalised block', async () => {
-    const txs = await Promise.all([
-      makeEmptyProcessedTestTx(context.actualDb),
-      makeEmptyProcessedTestTx(context.actualDb),
-    ]);
-
-    const blockTicket = await context.orchestrator.startNewBlock(txs.length, context.globalVariables, []);
-
-    await context.orchestrator.setBlockCompleted();
-
-    const result = await blockTicket.provingPromise;
-    expect(result.status).toBe(PROVING_STATUS.SUCCESS);
-    const finalisedBlock = await context.orchestrator.finaliseBlock();
-    expect(finalisedBlock.block.number).toEqual(context.blockNumber);
-    await expect(async () => await context.orchestrator.finaliseBlock()).rejects.toThrow('Block already finalised');
-  });
-
   it('throws if adding to a cancelled block', async () => {
     await context.orchestrator.startNewBlock(2, context.globalVariables, []);
-    context.orchestrator.cancelBlock();
+    context.orchestrator.cancel();
 
     await expect(
       async () => await context.orchestrator.addNewTx(makeEmptyProcessedTestTx(context.actualDb)),
diff --git a/yarn-project/prover-client/src/orchestrator/orchestrator_failures.test.ts b/yarn-project/prover-client/src/orchestrator/orchestrator_failures.test.ts
index d4ab78cb91c4..a3bd36fe359d 100644
--- a/yarn-project/prover-client/src/orchestrator/orchestrator_failures.test.ts
+++ b/yarn-project/prover-client/src/orchestrator/orchestrator_failures.test.ts
@@ -1,4 +1,6 @@
 import { PROVING_STATUS, type ServerCircuitProver } from '@aztec/circuit-types';
+import { Fr } from '@aztec/circuits.js';
+import { times } from '@aztec/foundation/collection';
 import { createDebugLogger } from '@aztec/foundation/log';
 import { WASMSimulator } from '@aztec/simulator';
 import { NoopTelemetryClient } from '@aztec/telemetry-client/noop';
@@ -6,7 +8,7 @@ import { NoopTelemetryClient } from '@aztec/telemetry-client/noop';
 import { jest } from '@jest/globals';
 
 import { TestCircuitProver } from '../../../bb-prover/src/test/test_circuit_prover.js';
-import { makeBloatedProcessedTx } from '../mocks/fixtures.js';
+import { makeBloatedProcessedTx, makeGlobals } from '../mocks/fixtures.js';
 import { TestContext } from '../mocks/test_context.js';
 import { ProvingOrchestrator } from './orchestrator.js';
 
@@ -51,19 +53,18 @@ describe('prover/orchestrator/failures', () => {
           jest.spyOn(mockProver, 'getBlockRootRollupProof').mockRejectedValue('Block Root Rollup Failed');
         },
       ],
-      // TODO(#7346): Integrate batch rollup circuits into orchestrator and test here
-      // [
-      //   'Block Merge Rollup Failed',
-      //   () => {
-      //     jest.spyOn(mockProver, 'getBlockMergeRollupProof').mockRejectedValue('Block Merge Rollup Failed');
-      //   },
-      // ],
-      // [
-      //   'Root Rollup Failed',
-      //   () => {
-      //     jest.spyOn(mockProver, 'getRootRollupProof').mockRejectedValue('Root Rollup Failed');
-      //   },
-      // ],
+      [
+        'Block Merge Rollup Failed',
+        () => {
+          jest.spyOn(mockProver, 'getBlockMergeRollupProof').mockRejectedValue('Block Merge Rollup Failed');
+        },
+      ],
+      [
+        'Root Rollup Failed',
+        () => {
+          jest.spyOn(mockProver, 'getRootRollupProof').mockRejectedValue('Root Rollup Failed');
+        },
+      ],
       [
         'Base Parity Failed',
         () => {
@@ -78,18 +79,20 @@ describe('prover/orchestrator/failures', () => {
       ],
     ] as const)('handles a %s error', async (message: string, fn: () => void) => {
       fn();
-      const txs = [
-        makeBloatedProcessedTx(context.actualDb, 1),
-        makeBloatedProcessedTx(context.actualDb, 2),
-        makeBloatedProcessedTx(context.actualDb, 3),
-      ];
 
-      const blockTicket = await orchestrator.startNewBlock(txs.length, context.globalVariables, []);
+      const epochTicket = orchestrator.startNewEpoch(1, 3);
 
-      for (const tx of txs) {
-        await orchestrator.addNewTx(tx);
+      // We need at least 3 blocks and 3 txs to ensure all circuits are used
+      for (let i = 0; i < 3; i++) {
+        const txs = times(3, j => makeBloatedProcessedTx(context.actualDb, i * 10 + j + 1));
+        const msgs = [new Fr(i + 100)];
+        await orchestrator.startNewBlock(txs.length, makeGlobals(i + 1), msgs);
+        for (const tx of txs) {
+          await orchestrator.addNewTx(tx);
+        }
       }
-      await expect(blockTicket.provingPromise).resolves.toEqual({ status: PROVING_STATUS.FAILURE, reason: message });
+
+      await expect(epochTicket.provingPromise).resolves.toEqual({ status: PROVING_STATUS.FAILURE, reason: message });
     });
   });
 });
diff --git a/yarn-project/prover-client/src/orchestrator/orchestrator_lifecycle.test.ts b/yarn-project/prover-client/src/orchestrator/orchestrator_lifecycle.test.ts
index 4e669651a27a..3570e0af5919 100644
--- a/yarn-project/prover-client/src/orchestrator/orchestrator_lifecycle.test.ts
+++ b/yarn-project/prover-client/src/orchestrator/orchestrator_lifecycle.test.ts
@@ -1,11 +1,6 @@
-import { PROVING_STATUS, type ProvingFailure, type ServerCircuitProver } from '@aztec/circuit-types';
-import {
-  type GlobalVariables,
-  NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP,
-  NUM_BASE_PARITY_PER_ROOT_PARITY,
-} from '@aztec/circuits.js';
-import { fr, makeGlobalVariables } from '@aztec/circuits.js/testing';
-import { range } from '@aztec/foundation/array';
+import { type ServerCircuitProver } from '@aztec/circuit-types';
+import { NUM_BASE_PARITY_PER_ROOT_PARITY } from '@aztec/circuits.js';
+import { makeGlobalVariables } from '@aztec/circuits.js/testing';
 import { createDebugLogger } from '@aztec/foundation/log';
 import { type PromiseWithResolvers, promiseWithResolvers } from '@aztec/foundation/promise';
 import { sleep } from '@aztec/foundation/sleep';
@@ -14,7 +9,6 @@ import { NoopTelemetryClient } from '@aztec/telemetry-client/noop';
 import { jest } from '@jest/globals';
 
 import { TestCircuitProver } from '../../../bb-prover/src/test/test_circuit_prover.js';
-import { makeBloatedProcessedTx, makeGlobals } from '../mocks/fixtures.js';
 import { TestContext } from '../mocks/test_context.js';
 import { ProvingOrchestrator } from './orchestrator.js';
 
@@ -32,77 +26,6 @@ describe('prover/orchestrator/lifecycle', () => {
   });
 
   describe('lifecycle', () => {
-    it('cancels current block and switches to new ones', async () => {
-      const txs1 = [makeBloatedProcessedTx(context.actualDb, 1), makeBloatedProcessedTx(context.actualDb, 2)];
-
-      const txs2 = [makeBloatedProcessedTx(context.actualDb, 3), makeBloatedProcessedTx(context.actualDb, 4)];
-
-      const globals1: GlobalVariables = makeGlobals(100);
-      const globals2: GlobalVariables = makeGlobals(101);
-
-      const l1ToL2Messages = range(NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP, 1 + 0x400).map(fr);
-
-      const blockTicket1 = await context.orchestrator.startNewBlock(2, globals1, l1ToL2Messages);
-
-      await context.orchestrator.addNewTx(txs1[0]);
-      await context.orchestrator.addNewTx(txs1[1]);
-
-      // Now we cancel the block. The first block will come to a stop as and when current proofs complete
-      context.orchestrator.cancelBlock();
-
-      const result1 = await blockTicket1.provingPromise;
-
-      // in all likelihood, the block will have a failure code as we cancelled it
-      // however it may have actually completed proving before we cancelled in which case it could be a success code
-      if (result1.status === PROVING_STATUS.FAILURE) {
-        expect((result1 as ProvingFailure).reason).toBe('Proving cancelled');
-      }
-
-      await context.actualDb.rollback();
-
-      const blockTicket2 = await context.orchestrator.startNewBlock(2, globals2, l1ToL2Messages);
-
-      await context.orchestrator.addNewTx(txs2[0]);
-      await context.orchestrator.addNewTx(txs2[1]);
-
-      const result2 = await blockTicket2.provingPromise;
-      expect(result2.status).toBe(PROVING_STATUS.SUCCESS);
-      const finalisedBlock = await context.orchestrator.finaliseBlock();
-
-      expect(finalisedBlock.block.number).toEqual(101);
-    });
-
-    it('automatically cancels an incomplete block when starting a new one', async () => {
-      const txs1 = [makeBloatedProcessedTx(context.actualDb, 1), makeBloatedProcessedTx(context.actualDb, 2)];
-      const txs2 = [makeBloatedProcessedTx(context.actualDb, 3), makeBloatedProcessedTx(context.actualDb, 4)];
-
-      const globals1: GlobalVariables = makeGlobals(100);
-      const globals2: GlobalVariables = makeGlobals(101);
-
-      const l1ToL2Messages = range(NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP, 1 + 0x400).map(fr);
-
-      const blockTicket1 = await context.orchestrator.startNewBlock(2, globals1, l1ToL2Messages);
-
-      await context.orchestrator.addNewTx(txs1[0]);
-
-      await context.actualDb.rollback();
-
-      const blockTicket2 = await context.orchestrator.startNewBlock(2, globals2, l1ToL2Messages);
-
-      await context.orchestrator.addNewTx(txs2[0]);
-      await context.orchestrator.addNewTx(txs2[1]);
-
-      const result1 = await blockTicket1.provingPromise;
-      expect(result1.status).toBe(PROVING_STATUS.FAILURE);
-      expect((result1 as ProvingFailure).reason).toBe('Proving cancelled');
-
-      const result2 = await blockTicket2.provingPromise;
-      expect(result2.status).toBe(PROVING_STATUS.SUCCESS);
-      const finalisedBlock = await context.orchestrator.finaliseBlock();
-
-      expect(finalisedBlock.block.number).toEqual(101);
-    }, 60000);
-
     it('cancels proving requests', async () => {
       const prover: ServerCircuitProver = new TestCircuitProver(new NoopTelemetryClient());
       const orchestrator = new ProvingOrchestrator(context.actualDb, prover, new NoopTelemetryClient());
@@ -121,7 +44,7 @@ describe('prover/orchestrator/lifecycle', () => {
       expect(spy).toHaveBeenCalledTimes(NUM_BASE_PARITY_PER_ROOT_PARITY);
       expect(spy.mock.calls.every(([_, signal]) => !signal?.aborted)).toBeTruthy();
 
-      orchestrator.cancelBlock();
+      orchestrator.cancel();
 
       expect(spy.mock.calls.every(([_, signal]) => signal?.aborted)).toBeTruthy();
     });
   });
diff --git a/yarn-project/prover-client/src/orchestrator/orchestrator_multiple_blocks.test.ts b/yarn-project/prover-client/src/orchestrator/orchestrator_multiple_blocks.test.ts
index 814461cd36c7..7795c423e77c 100644
--- a/yarn-project/prover-client/src/orchestrator/orchestrator_multiple_blocks.test.ts
+++ b/yarn-project/prover-client/src/orchestrator/orchestrator_multiple_blocks.test.ts
@@ -19,17 +19,17 @@ describe('prover/orchestrator/multi-block', () => {
   });
 
   describe('multiple blocks', () => {
-    it('builds multiple blocks in sequence', async () => {
-      const numBlocks = 5;
+    it.each([4, 5])('builds an epoch with %s blocks in sequence', async (numBlocks: number) => {
+      const provingTicket = context.orchestrator.startNewEpoch(1, numBlocks);
 
       let header = context.actualDb.getInitialHeader();
 
       for (let i = 0; i < numBlocks; i++) {
+        logger.info(`Creating block ${i + 1000}`);
         const tx = makeBloatedProcessedTx(context.actualDb, i + 1);
         tx.data.constants.historicalHeader = header;
         tx.data.constants.vkTreeRoot = getVKTreeRoot();
 
         const blockNum = i + 1000;
-
         const globals = makeGlobals(blockNum);
 
         // This will need to be a 2 tx block
@@ -46,9 +46,15 @@ describe('prover/orchestrator/multi-block', () => {
 
         expect(finalisedBlock.block.number).toEqual(blockNum);
         header = finalisedBlock.block.header;
-
-        await context.actualDb.commit();
       }
+
+      logger.info('Awaiting epoch ticket');
+      const result = await provingTicket.provingPromise;
+      expect(result).toEqual({ status: PROVING_STATUS.SUCCESS });
+
+      const epoch = context.orchestrator.finaliseEpoch();
+      expect(epoch.publicInputs.endBlockNumber.toNumber()).toEqual(1000 + numBlocks - 1);
+      expect(epoch.proof).toBeDefined();
     });
   });
 });
diff --git a/yarn-project/prover-client/src/orchestrator/orchestrator_workflow.test.ts b/yarn-project/prover-client/src/orchestrator/orchestrator_workflow.test.ts
index b49f7cddb134..9d03a31a25ea 100644
--- a/yarn-project/prover-client/src/orchestrator/orchestrator_workflow.test.ts
+++ b/yarn-project/prover-client/src/orchestrator/orchestrator_workflow.test.ts
@@ -64,7 +64,7 @@ describe('prover/orchestrator', () => {
       await sleep(10);
 
       expect(mockProver.getRootParityProof).toHaveBeenCalledTimes(1);
 
-      orchestrator.cancelBlock();
+      orchestrator.cancel();
     });
   });
 });
diff --git a/yarn-project/prover-client/src/prover-agent/agent-queue-rpc-integration.test.ts b/yarn-project/prover-client/src/prover-agent/agent-queue-rpc-integration.test.ts
index efdab789d640..cd30492f0151 100644
--- a/yarn-project/prover-client/src/prover-agent/agent-queue-rpc-integration.test.ts
+++ b/yarn-project/prover-client/src/prover-agent/agent-queue-rpc-integration.test.ts
@@ -41,6 +41,7 @@ describe('Prover agent <-> queue integration', () => {
     getRootParityProof: makeRootParityInputs,
     getBlockMergeRollupProof: makeBlockMergeRollupInputs,
     getBlockRootRollupProof: makeBlockRootRollupInputs,
+    getBlockRootRollupFinalProof: makeBlockRootRollupInputs,
     getEmptyPrivateKernelProof: () =>
       new PrivateKernelEmptyInputData(makeHeader(), Fr.random(), Fr.random(), Fr.random()),
     getEmptyTubeProof: () => new PrivateKernelEmptyInputData(makeHeader(), Fr.random(), Fr.random(), Fr.random()),
diff --git a/yarn-project/prover-client/src/prover-agent/memory-proving-queue.ts b/yarn-project/prover-client/src/prover-agent/memory-proving-queue.ts
index f5b35a691927..6f54ae5fe873 100644
--- a/yarn-project/prover-client/src/prover-agent/memory-proving-queue.ts
+++ b/yarn-project/prover-client/src/prover-agent/memory-proving-queue.ts
@@ -351,6 +351,14 @@ export class MemoryProvingQueue implements ServerCircuitProver, ProvingJobSource
     return this.enqueue({ type: ProvingRequestType.BLOCK_ROOT_ROLLUP, inputs: input }, signal, epochNumber);
   }
 
+  getBlockRootRollupFinalProof(
+    input: BlockRootRollupInputs,
+    signal?: AbortSignal,
+    epochNumber?: number,
+  ): Promise<PublicInputsAndRecursiveProof<BlockRootOrBlockMergePublicInputs>> {
+    return this.enqueue({ type: ProvingRequestType.BLOCK_ROOT_ROLLUP_FINAL, inputs: input }, signal, epochNumber);
+  }
+
   /**
    * Creates a proof for the given input.
    * @param input - Input to the circuit.
diff --git a/yarn-project/prover-client/src/prover-agent/prover-agent.ts b/yarn-project/prover-client/src/prover-agent/prover-agent.ts
index f65516bd0871..a4af27153d71 100644
--- a/yarn-project/prover-client/src/prover-agent/prover-agent.ts
+++ b/yarn-project/prover-client/src/prover-agent/prover-agent.ts
@@ -185,6 +185,10 @@ export class ProverAgent {
         return this.circuitProver.getBlockRootRollupProof(inputs);
       }
 
+      case ProvingRequestType.BLOCK_ROOT_ROLLUP_FINAL: {
+        return this.circuitProver.getBlockRootRollupFinalProof(inputs);
+      }
+
       case ProvingRequestType.BLOCK_MERGE_ROLLUP: {
         return this.circuitProver.getBlockMergeRollupProof(inputs);
       }
diff --git a/yarn-project/prover-client/src/test/bb_prover_full_rollup.test.ts b/yarn-project/prover-client/src/test/bb_prover_full_rollup.test.ts
index 8ffad1ac338c..fdecf4270cab 100644
--- a/yarn-project/prover-client/src/test/bb_prover_full_rollup.test.ts
+++ b/yarn-project/prover-client/src/test/bb_prover_full_rollup.test.ts
@@ -66,7 +66,7 @@ describe('prover/bb_prover/full-rollup', () => {
     logger.info(`Finalising block`);
     const blockResult = await context.orchestrator.finaliseBlock();
 
-    await expect(prover.verifyProof('BlockRootRollupArtifact', blockResult.proof)).resolves.not.toThrow();
+    await expect(prover.verifyProof('BlockRootRollupFinalArtifact', blockResult.proof)).resolves.not.toThrow();
   });
 
   // TODO(@PhilWindle): Remove public functions and re-enable once we can handle empty tx slots
diff --git a/yarn-project/prover-client/src/test/mock_prover.ts b/yarn-project/prover-client/src/test/mock_prover.ts
index 6c4fcdc7ed41..021832713a7a 100644
--- a/yarn-project/prover-client/src/test/mock_prover.ts
+++ b/yarn-project/prover-client/src/test/mock_prover.ts
@@ -88,6 +88,16 @@ export class MockProver implements ServerCircuitProver {
     );
   }
 
+  getBlockRootRollupFinalProof(): Promise<PublicInputsAndRecursiveProof<BlockRootOrBlockMergePublicInputs>> {
+    return Promise.resolve(
+      makePublicInputsAndRecursiveProof(
+        makeBlockRootOrBlockMergeRollupPublicInputs(),
+        makeRecursiveProof(RECURSIVE_PROOF_LENGTH),
+        VerificationKeyData.makeFake(),
+      ),
+    );
+  }
+
   getEmptyPrivateKernelProof(): Promise<PublicInputsAndRecursiveProof<KernelCircuitPublicInputs>> {
     return Promise.resolve(
       makePublicInputsAndRecursiveProof(
diff --git a/yarn-project/prover-node/src/job/block-proving-job.ts b/yarn-project/prover-node/src/job/block-proving-job.ts
index a9fb9c822d86..1f8e70f09b17 100644
--- a/yarn-project/prover-node/src/job/block-proving-job.ts
+++ b/yarn-project/prover-node/src/job/block-proving-job.ts
@@ -129,7 +129,7 @@ export class BlockProvingJob {
   }
 
   public stop() {
-    this.prover.cancelBlock();
+    this.prover.cancel();
   }
 
   private async getBlock(blockNumber: number): Promise<L2Block> {
diff --git a/yarn-project/sequencer-client/src/block_builder/light.ts b/yarn-project/sequencer-client/src/block_builder/light.ts
index 2dbbdb75abe4..7443087bd6a4 100644
--- a/yarn-project/sequencer-client/src/block_builder/light.ts
+++ b/yarn-project/sequencer-client/src/block_builder/light.ts
@@ -9,30 +9,21 @@ import {
   type ProcessedTx,
   type ProvingTicket,
   type SimulationBlockResult,
-  TxEffect,
+  type TxEffect,
   makeEmptyProcessedTx,
   toTxEffect,
 } from '@aztec/circuit-types';
 import {
-  ContentCommitment,
   Fr,
   type GlobalVariables,
-  Header,
-  MerkleTreeCalculator,
   NESTED_RECURSIVE_PROOF_LENGTH,
   NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP,
-  NUM_BASE_PARITY_PER_ROOT_PARITY,
-  NUM_MSGS_PER_BASE_PARITY,
-  PartialStateReference,
-  StateReference,
   VerificationKeyData,
   makeEmptyRecursiveProof,
 } from '@aztec/circuits.js';
 import { padArrayEnd } from '@aztec/foundation/collection';
-import { sha256Trunc } from '@aztec/foundation/crypto';
-import { computeUnbalancedMerkleRoot } from '@aztec/foundation/trees';
 import { getVKTreeRoot } from '@aztec/noir-protocol-circuits-types';
-import { buildBaseRollupInput, getTreeSnapshot } from '@aztec/prover-client/helpers';
+import { buildBaseRollupInput, buildHeaderFromTxEffects, getTreeSnapshot } from '@aztec/prover-client/helpers';
 import { type TelemetryClient } from '@aztec/telemetry-client';
 import { NoopTelemetryClient } from '@aztec/telemetry-client/noop';
@@ -78,7 +69,7 @@ export class LightweightBlockBuilder implements BlockSimulator {
     );
   }
 
-  cancelBlock(): void {}
+  cancel(): void {}
 
   async setBlockCompleted(): Promise<ProvingTicket> {
     const paddingTxCount = this.numTxs! - this.txs.length;
@@ -101,7 +92,7 @@ export class LightweightBlockBuilder implements BlockSimulator {
       .map(tx => toTxEffect(tx, this.globalVariables!.gasFees))
       .filter(txEffect => !txEffect.isEmpty());
     const body = new Body(nonEmptyTxEffects);
-    const header = await this.makeHeader(body);
+    const header = await buildHeaderFromTxEffects(body, this.globalVariables!, this.l1ToL2Messages!, this.db);
 
     await this.db.updateArchive(header);
     const newArchive = await getTreeSnapshot(MerkleTreeId.ARCHIVE, this.db);
@@ -109,50 +100,6 @@ export class LightweightBlockBuilder implements BlockSimulator {
     const block = new L2Block(newArchive, header, body);
     return { block };
   }
-
-  private async makeHeader(body: Body): Promise<Header> {
-    const { db } = this;
-
-    const stateReference = new StateReference(
-      await getTreeSnapshot(MerkleTreeId.L1_TO_L2_MESSAGE_TREE, db),
-      new PartialStateReference(
-        await getTreeSnapshot(MerkleTreeId.NOTE_HASH_TREE, db),
-        await getTreeSnapshot(MerkleTreeId.NULLIFIER_TREE, db),
-        await getTreeSnapshot(MerkleTreeId.PUBLIC_DATA_TREE, db),
-      ),
-    );
-
-    const previousArchive = await getTreeSnapshot(MerkleTreeId.ARCHIVE, db);
-
-    const outHash = computeUnbalancedMerkleRoot(
-      body.txEffects.map(tx => tx.txOutHash()),
-      TxEffect.empty().txOutHash(),
-    );
-
-    const paritySize = NUM_BASE_PARITY_PER_ROOT_PARITY * NUM_MSGS_PER_BASE_PARITY;
-    const parityHeight = Math.ceil(Math.log2(paritySize));
-    const hasher = (left: Buffer, right: Buffer) => sha256Trunc(Buffer.concat([left, right]));
-    const parityShaRoot = new MerkleTreeCalculator(parityHeight, Fr.ZERO.toBuffer(), hasher).computeTreeRoot(
-      this.l1ToL2Messages!.map(msg => msg.toBuffer()),
-    );
-
-    const contentCommitment = new ContentCommitment(
-      new Fr(this.numTxs!),
-      body.getTxsEffectsHash(),
-      parityShaRoot,
-      outHash,
-    );
-
-    const fees = this.txs!.reduce(
-      (acc, tx) =>
-        acc
-          .add(tx.data.constants.txContext.gasSettings.inclusionFee)
-          .add(tx.data.end.gasUsed.computeFee(this.globalVariables!.gasFees)),
-      Fr.ZERO,
-    );
-
-    return new Header(previousArchive, contentCommitment, stateReference, this.globalVariables!, fees);
-  }
 }
 
 export class LightweightBlockBuilderFactory {
diff --git a/yarn-project/sequencer-client/src/block_builder/orchestrator.ts b/yarn-project/sequencer-client/src/block_builder/orchestrator.ts
index c415e7913427..31f00f26ea55 100644
--- a/yarn-project/sequencer-client/src/block_builder/orchestrator.ts
+++ b/yarn-project/sequencer-client/src/block_builder/orchestrator.ts
@@ -28,8 +28,8 @@ export class OrchestratorBlockBuilder implements BlockSimulator {
   startNewBlock(numTxs: number, globalVariables: GlobalVariables, l1ToL2Messages: Fr[]): Promise<ProvingTicket> {
     return this.orchestrator.startNewBlock(numTxs, globalVariables, l1ToL2Messages);
   }
-  cancelBlock(): void {
-    this.orchestrator.cancelBlock();
+  cancel(): void {
+    this.orchestrator.cancel();
   }
   finaliseBlock(): Promise<BlockResult> {
     return this.orchestrator.finaliseBlock();
diff --git a/yarn-project/sequencer-client/src/sequencer/sequencer.test.ts b/yarn-project/sequencer-client/src/sequencer/sequencer.test.ts
index 004b862bacc1..2b20ac571138 100644
--- a/yarn-project/sequencer-client/src/sequencer/sequencer.test.ts
+++ b/yarn-project/sequencer-client/src/sequencer/sequencer.test.ts
@@ -209,7 +209,7 @@ describe('sequencer', () => {
     // Ok, we have an issue that we never actually call the process L2 block
     expect(publisher.proposeL2Block).toHaveBeenCalledTimes(1);
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), [txHash]);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block when it is their turn', async () => {
@@ -258,7 +258,7 @@ describe('sequencer', () => {
       Array(NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP).fill(new Fr(0n)),
     );
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), [txHash]);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block out of several txs rejecting double spends', async () => {
@@ -302,7 +302,7 @@ describe('sequencer', () => {
     );
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), validTxHashes);
     expect(p2p.deleteTxs).toHaveBeenCalledWith([doubleSpendTx.getTxHash()]);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block out of several txs rejecting incorrect chain ids', async () => {
@@ -341,7 +341,7 @@ describe('sequencer', () => {
     );
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), validTxHashes);
     expect(p2p.deleteTxs).toHaveBeenCalledWith([invalidChainTx.getTxHash()]);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block out of several txs dropping the ones that go over max size', async () => {
@@ -381,7 +381,7 @@ describe('sequencer', () => {
       Array(NUMBER_OF_L1_L2_MESSAGES_PER_ROLLUP).fill(new Fr(0n)),
     );
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), validTxHashes);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block once it reaches the minimum number of transactions', async () => {
@@ -432,7 +432,7 @@ describe('sequencer', () => {
     );
     expect(publisher.proposeL2Block).toHaveBeenCalledTimes(1);
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), txHashes);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block that contains zero real transactions once flushed', async () => {
@@ -483,7 +483,7 @@ describe('sequencer', () => {
     );
     expect(publisher.proposeL2Block).toHaveBeenCalledTimes(1);
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), []);
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('builds a block that contains less than the minimum number of transactions once flushed', async () => {
@@ -537,7 +537,7 @@ describe('sequencer', () => {
     expect(publisher.proposeL2Block).toHaveBeenCalledTimes(1);
     expect(publisher.proposeL2Block).toHaveBeenCalledWith(block, getSignatures(), postFlushTxHashes);
 
-    expect(blockSimulator.cancelBlock).toHaveBeenCalledTimes(0);
+    expect(blockSimulator.cancel).toHaveBeenCalledTimes(0);
   });
 
   it('aborts building a block if the chain moves underneath it', async () => {
diff --git a/yarn-project/sequencer-client/src/sequencer/sequencer.ts b/yarn-project/sequencer-client/src/sequencer/sequencer.ts
index 69b8398d1522..04426ee93d06 100644
--- a/yarn-project/sequencer-client/src/sequencer/sequencer.ts
+++ b/yarn-project/sequencer-client/src/sequencer/sequencer.ts
@@ -443,7 +443,7 @@ export class Sequencer {
         processedTxsCount: processedTxs.length,
       })
     ) {
-      blockBuilder.cancelBlock();
+      blockBuilder.cancel();
       throw new Error('Should not propose the block');
     }
 
diff --git a/yarn-project/telemetry-client/src/attributes.ts b/yarn-project/telemetry-client/src/attributes.ts
index 34ad82007fd4..4a320161d699 100644
--- a/yarn-project/telemetry-client/src/attributes.ts
+++ b/yarn-project/telemetry-client/src/attributes.ts
@@ -48,6 +48,10 @@ export const BLOCK_CANDIDATE_TXS_COUNT = 'aztec.block.candidate_txs_count';
 export const BLOCK_TXS_COUNT = 'aztec.block.txs_count';
 /** The block size (power of 2) */
 export const BLOCK_SIZE = 'aztec.block.size';
+/** How many blocks are included in this epoch */
+export const EPOCH_SIZE = 'aztec.epoch.size';
+/** The epoch number */
+export const EPOCH_NUMBER = 'aztec.epoch.number';
 /** The tx hash */
 export const TX_HASH = 'aztec.tx.hash';
 /** Generic attribute representing whether the action was successful or not */