feat: WIP batch blobs and validate in rollup #13817

Merged

MirandaWood merged 88 commits into `mw/blob-batching` on Jun 4, 2025
Conversation
This reverts commit 68be71e.
MirandaWood (Member): I've updated this PR to pull a version of noir which includes a fix for the circuit simulator.
LeilaWang approved these changes on May 29, 2025
Review threads (outdated, resolved):
- ...ol-circuits/crates/rollup-lib/src/block_root/components/block_root_rollup_output_composer.nr
- noir-projects/noir-protocol-circuits/crates/types/src/constants.nr
iAmMichaelConnor approved these changes on Jun 1, 2025
Review thread (resolved):
- noir-projects/noir-protocol-circuits/crates/rollup-lib/src/abis/block_root_rollup_data.nr
MirandaWood added a commit that referenced this pull request on Jun 3, 2025
Ts only blob batching methods plus tests. Points to the parent methods PR: #13583.

TODOs (marked in files as `TODO(MW)`):
- [ ] Remove the large trusted setup file? Not sure if it's required, but it is currently the only way I show in tests that our BLS12 methods match those in c-kzg.
- [x] Add nr fixture where we can use `updateInlineTestData` for point compression.

Other TODOs must wait until we actually integrate batching, otherwise I will break the repo.

NB: The files `bls12_fields.ts` and `bls12_point.ts` and their tests are essentially copies of `./fields.ts` and `./point.ts`. When reviewing, please keep that in mind and double check the original file if you see an issue before commenting (@iAmMichaelConnor ;) ).

---

## PR Stack
- [ ] `mw/blob-batching` <- main feature
- [ ] ^ `mw/blob-batching-bls-utils` <- BLS12-381 bigcurve and bignum utils (noir) (#13583)
- [x] ^ `mw/blob-batching-bls-utils-ts` <- BLS12-381 bigcurve and bignum utils (ts) (#13606)
- [ ] ^ `mw/blob-batching-integration` <- Integrate batching into noir protocol circuits (#13817)
- [ ] ^ `mw/blob-batching-integration-ts-sol` <- Integrate batching into ts and solidity (#14329)
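As background for the limbed representation these utils work with: a BLS12-381 base-field element is 381 bits, which does not fit in a single BN254 field, so it is split into limbs. A minimal sketch of the idea (the limb width of 120 bits and the function names here are illustrative assumptions, not the repo's API):

```typescript
// Illustrative sketch: splitting a 381-bit BLS12-381 coordinate into
// 4 limbs of 120 bits each (so a G1 point costs 4 + 4 + 1 = 9 fields:
// x limbs, y limbs, and an is_inf flag). Names/widths are assumptions.
const LIMB_BITS = 120n;
const NUM_LIMBS = 4;

function toLimbs(x: bigint): bigint[] {
  const mask = (1n << LIMB_BITS) - 1n;
  const limbs: bigint[] = [];
  for (let i = 0; i < NUM_LIMBS; i++) {
    limbs.push(x & mask); // least-significant limb first
    x >>= LIMB_BITS;
  }
  return limbs;
}

function fromLimbs(limbs: bigint[]): bigint {
  // Recombine: limbs[0] is least significant.
  return limbs.reduceRight((acc, limb) => (acc << LIMB_BITS) | limb, 0n);
}
```

This is also why the compression question in the later PRs matters: the same point fits in 2 fields when only the compressed x coordinate (plus flag bits) is carried.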
Base automatically changed from `mw/blob-batching-bls-utils-ts` to `mw/blob-batching` on June 3, 2025 09:19
MirandaWood added a commit that referenced this pull request on Jun 4, 2025
## Finalises integration of batched blobs

`mw/blob-batching-integration` adds batching to the rollup .nr circuits only (=> will not run in the repo). This PR brings those changes downstream to the typescript and L1 contracts.

Main changes:
- L1 Contracts:
  - No longer call the point evaluation precompile on `propose`; instead inject the blob commitments, check they correspond to the broadcast blobs, and store them in the `blobCommitmentsHash`
  - Do not store any blob public inputs apart from the `blobCommitmentsHash` (no longer required)
  - Call the point evaluation precompile once on `submitEpochRootProof` for ALL blobs in the epoch
  - Use the same precompile inputs as public inputs to the root proof verification along with the `blobCommitmentsHash` to link the circuit batched blob, real L1 blobs, and the batched blob verified on L1
  - Refactor the mock blob oracle
- Injects the final blob challenges used on each blob into all block building methods in `orchestrator`
- Accumulates blobs in ts when building blocks and uses them as inputs to each rollup circuit
- Returns the blob inputs required for `submitEpochRootProof` on `finaliseEpoch()`
- Updates nr structs in ts plus fixtures and tests

## TODOs/Current issues
- ~~When using real proofs (e.g. `yarn-project/prover-client/src/test/bb_prover_full_rollup.test.ts`), the root rollup proof is generated correctly but fails verification checks in `bb` due to an incorrect number of public inputs. Changing the number correctly updates vks and all constants elsewhere, but `bb` does not change.~~ EDIT: solved - must include the `is_inf` point member for now (see below TODO)
- ~~The `Prover.toml` for block-root is not executing. The error manifests in the same way as that in #12540 (but may be different).~~ EDIT: temporarily fixed - details in this repro (#14381) and noir issue (noir-lang/noir#8563).
- BLS points in noir take up 9 fields (4 for each coordinate as a limbed bignum, 1 for the `is_inf` flag) but can be compressed to only 2. For recursive verification in block root and above, would it be worth the gates to compress these? It depends whether the gate cost of compression is more/less than the gate cost of recursively verifying 7 more public inputs.

## PR Stack
- [ ] `mw/blob-batching` <- main feature
- [ ] ^ `mw/blob-batching-bls-utils` <- BLS12-381 bigcurve and bignum utils (noir) (#13583)
- [ ] ^ `mw/blob-batching-bls-utils-ts` <- BLS12-381 bigcurve and bignum utils (ts) (#13606)
- [ ] ^ `mw/blob-batching-integration` <- Integrate batching into noir protocol circuits (#13817)
- [x] ^ `mw/blob-batching-integration-ts-sol` <- Integrate batching into ts and solidity (#14329)

---

Co-authored-by: Tom French <15848336+TomAFrench@users.noreply.github.com>
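The `blobCommitmentsHash` described above is a running digest: on each `propose`, every injected (and validated) blob commitment is folded into a single stored value, so the epoch proof only has to reproduce one hash rather than a list of per-blob inputs. A hypothetical sketch of that folding (hash function, encoding, and names here are illustrative assumptions, not the contract's actual implementation):

```typescript
import { createHash } from 'node:crypto';

// Hypothetical sketch of a running blobCommitmentsHash: each 48-byte blob
// commitment is folded into a single 32-byte value as blocks are proposed.
// SHA-256 and the concatenation encoding are stand-ins for illustration.
function foldCommitment(current: Buffer, commitment: Buffer): Buffer {
  return createHash('sha256')
    .update(Buffer.concat([current, commitment]))
    .digest();
}

// Fold all commitments seen so far, starting from an empty (zero) state.
function blobCommitmentsHash(commitments: Buffer[]): Buffer {
  return commitments.reduce(foldCommitment, Buffer.alloc(32));
}
```

Because the fold is order-sensitive, the final value commits to both the set and the sequence of blobs, which is what lets `submitEpochRootProof` link the circuit's batched blob to the real L1 blobs.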
github-merge-queue bot pushed a commit that referenced this pull request on Jun 9, 2025
## The blobs are back in town.

This PR reworks blobs so that instead of calling the point evaluation precompile for each blob (currently up to 3 per block => up to 96 (?) calls per epoch), we call it once per epoch by batching blobs to a single kzg commitment, opening, challenge, and proof. How we can be sure that this one pairing check is equivalent to a check per blob is covered in the maths by @iAmMichaelConnor [here](https://hackmd.io/WUtNusQxS5KAw-af3gxycA?view) 🎉

## Overview

Instead of pushing to a long array of `BlobPublicInputs`, which are then individually checked on L1, we batch each blob together to a single set of `BlobAccumulatorPublicInputs`. The `start` accumulator state is fed into each block root circuit, where the block's blobs are accumulated and the `end` state is set. Each block merge circuit checks that the state follows on correctly and, finally, the root circuit checks that the very first `start` state was empty and finalises the last `end` state. This last `end` state makes up the set of inputs for the point evaluation precompile. If the pairing check in that precompile passes, we know that all blobs for all blocks in the epoch are valid and contain only the tx effects validated by the rollup.

### Circuits
Key changes:
- Integrate BLS12-381 curve operations with `bignum` and `bigcurve` libraries, plus tests.
- Rework the `blob` package to batch blobs and store them in reworked structs, plus tests.
- Rework the rollup circuits from `block_root` above to handle blob accumulation state rather than a list of individual blob inputs, plus (you guessed it) tests.

### Contracts
The contracts:
- No longer call the point evaluation precompile on `propose`; instead inject the blob commitments, check they correspond to the broadcast blobs, and store them in the `blobCommitmentsHash`.
- Do not store any blob public inputs apart from the `blobCommitmentsHash`.
- Call the point evaluation precompile once on `submitEpochRootProof` for ALL blobs in the epoch.
- Use the same precompile inputs as public inputs to the root proof verification along with the `blobCommitmentsHash` to link the circuit batched blob, real L1 blobs, and the batched blob verified on L1.

### TypeScript
Key changes:
- Edit all the structs and methods reliant on the circuits/contracts to match the above changes.
- Inject the final blob challenges used on each blob into all block building methods in `orchestrator`.
- Accumulate blobs in ts when building blocks and use them as inputs to each rollup circuit, plus tests.
- Return the blob inputs required for `submitEpochRootProof` on `finaliseEpoch()`.

### TODOs/Related Issues
- Choose field for hashing challenge: #13608
- Instead of exponentiating `gamma` (expensive!), hash it for each iteration: #13740
- Number of public inputs: BLS points in noir take up 9 fields (4 for each coordinate as a limbed bignum, 1 for the `is_inf` flag) but can be compressed to only 2. For recursive verification in block root and above, would it be worth the gates to compress these? It depends whether the gate cost of compression is more/less than the gate cost of recursively verifying 7 more public inputs.
- Remove the large trusted setup file from `yarn-project/blob-lib/src/trusted_setup_bit_reversed.json`? Used in testing, but may not be worth keeping (see code comments).
- Clean up old, unused blob stuff in #14637.

## PR Stack
- [x] `mw/blob-batching` <- main feature
- [x] ^ `mw/blob-batching-bls-utils` <- BLS12-381 bigcurve and bignum utils (noir) (#13583)
- [x] ^ `mw/blob-batching-bls-utils-ts` <- BLS12-381 bigcurve and bignum utils (ts) (#13606)
- [x] ^ `mw/blob-batching-integration` <- Integrate batching into noir protocol circuits (#13817)
- [x] ^ `mw/blob-batching-integration-ts-sol` <- Integrate batching into ts and solidity (#14329)
- [ ] ^ `mw/blob-batching-cleanup` <- Remove old blob code

---

Co-authored-by: Tom French <15848336+TomAFrench@users.noreply.github.com>
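The core reason one precompile call can stand in for many (the argument worked through in the linked hackmd) is a standard random-linear-combination trick: with a random challenge `gamma`, the sum `Σ gammaⁱ · (pᵢ(z) − yᵢ)` is zero only if, with overwhelming probability, every individual claim `pᵢ(z) = yᵢ` holds. A toy sketch over a small prime field (deliberately NOT BLS12-381, and with illustrative names):

```typescript
// Toy illustration of batching evaluation claims with a random challenge.
// Uses the Goldilocks prime for readability; the real scheme works over
// the BLS12-381 scalar field with KZG commitments and one pairing check.
const P = 0xffffffff00000001n;

const mod = (x: bigint): bigint => ((x % P) + P) % P;

// Evaluate a polynomial (coefficients low-to-high) at z via Horner's rule.
function evalPoly(coeffs: bigint[], z: bigint): bigint {
  return coeffs.reduceRight((acc, c) => mod(acc * z + c), 0n);
}

// Fold many (polynomial, claimed evaluation) pairs into one claim:
// returns sum_i gamma^i * (p_i(z) - y_i), which is zero iff (w.h.p.
// over a random gamma) every individual claim is correct.
function batchedClaim(
  polys: bigint[][],
  claims: bigint[],
  z: bigint,
  gamma: bigint,
): bigint {
  let acc = 0n;
  let g = 1n; // gamma^0
  for (let i = 0; i < polys.length; i++) {
    acc = mod(acc + g * mod(evalPoly(polys[i], z) - claims[i]));
    g = mod(g * gamma);
  }
  return acc;
}
```

Note this also explains the TODO about exponentiating `gamma`: the in-circuit accumulator pays for each successive power, hence the suggestion to hash the challenge per iteration instead.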
danielntmd pushed a commit to danielntmd/aztec-packages that referenced this pull request on Jul 16, 2025
WIP

TODOs
- `blob.nr` files and remove pubs w/o batching (will do this later so it's easier to review)
- `RootRollupPublicInputs` so it doesn't contain unnecessary values not needed for L1 verification

PR Stack
- `mw/blob-batching` <- main feature
- `mw/blob-batching-bls-utils` <- BLS12-381 bigcurve and bignum utils (noir) (feat: blob batching methods #13583)
- `mw/blob-batching-bls-utils-ts` <- BLS12-381 bigcurve and bignum utils (ts) (feat: blob batching methods (ts) #13606)
- `mw/blob-batching-integration` <- Integrate batching into noir protocol circuits (feat: WIP batch blobs and validate in rollup #13817)
- `mw/blob-batching-integration-ts-sol` <- Integrate batching into ts and solidity (feat: WIP integrate batched blobs into l1 contracts + ts #14329)