feat: blob batching methods (ts) #13606
Status: Merged

Commits (59), all by MirandaWood:
- `97e2a01` feat: blob batching methods - nr only
- `52801e6` chore: fmt, more tests, rearranging
- `35e8be2` feat: BLS12 field, curve methods, blob batching methods, ts only
- `db03339` chore: lint, cleanup
- `68be71e` chore: remove trusted setup file + test using it (size issues)
- `1fa5d49` Revert "chore: remove trusted setup file + test using it (size issues)"
- `33d62a7` chore: cleanup packages + increase playground size
- `46b8866` feat: address some comments, cleanup
- `346ca9a` chore: update some comments
- `f655bf5` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `8d57216` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `9490422` chore: renaming, cleanup
- `c300c77` chore: renaming, cleanup
- `63ffe96` chore: add issue nums (hopefully force ci cache reset)
- `606a942` feat: as isNegative to F, rename proof -> Q
- `75d6d35` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `40ffd8b` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `951d329` chore: bumped vite kb limit 1700 -> 1720
- `307cb09` chore: bumped vite kb limit 1700 -> 1750
- `5251cd1` feat: adding helpers, constants, docs, etc. for integration
- `de8ec1e` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `ca0da9d` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `d5f11d4` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `d3ac058` chore: rename v -> blob_commitments_hash, move noir ref further up stack
- `6750476` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `8e968ac` chore: rename v to blobCommitmentsHash
- `1fc2c0d` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `cda163a` chore: use updated methods from bignum, remove warnings
- `4806435` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `49c7be3` chore: switch bigcurve branch to remove visibility warnings
- `ee900fe` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `2216519` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `55bc974` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `df48994` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `dc1bd18` chore: fmt
- `cb146ae` fix: include is_inf in all serialization so recursion works
- `18db30a` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `3398526` chore: update import
- `e0f687a` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `5f02ae1` feat: add point compression unit test
- `480b8de` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `51899bd` chore: add fixture test for point compression, bring down new bls met…
- `4c5c437` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `1b7fbf0` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `8667c5c` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
- `ff52662` chore: bump bignum
- `0c91085` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `3064028` feat: address some comments
- `b358b3e` chore: test using toEqual in jest
- `fb8e45a` feat: init bigint and buffer, remove static compress
- `c75232b` feat: replace empty blob assumption
- `9e80ff2` feat: address some comments
- `2d1f35e` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `ea19acd` chore: add extra check before blob acc init
- `e89fd4a` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `92f6e5f` chore: renaming, bring down changes from integration branch, cleanup
- `88b4b28` Merge remote-tracking branch 'origin/mw/blob-batching-bls-utils' into…
- `0ee11fd` chore: cleanup, bring down changes from other PRs
- `f8ea4d2` Merge remote-tracking branch 'origin/mw/blob-batching' into mw/blob-b…
New test file (186 lines added). Imports and trusted setup loading:

```typescript
import { BLOBS_PER_BLOCK, FIELDS_PER_BLOB } from '@aztec/constants';
import { fromHex } from '@aztec/foundation/bigint-buffer';
import { poseidon2Hash, randomBigInt, sha256ToField } from '@aztec/foundation/crypto';
import { BLS12Fr, BLS12Point, Fr } from '@aztec/foundation/fields';
import { fileURLToPath } from '@aztec/foundation/url';

import cKzg from 'c-kzg';
import { readFileSync } from 'fs';
import { dirname, resolve } from 'path';

import { BatchedBlob, Blob } from './index.js';

// TODO(MW): Remove below file and test? Only required to ensure committing and compression are correct.
const trustedSetup = JSON.parse(
  readFileSync(resolve(dirname(fileURLToPath(import.meta.url)), 'trusted_setup_bit_reversed.json')).toString(),
);

// Importing directly from 'c-kzg' does not work:
const { FIELD_ELEMENTS_PER_BLOB, computeKzgProof, loadTrustedSetup, verifyKzgProof } = cKzg;

try {
  loadTrustedSetup();
} catch (error: any) {
  if (error.message.includes('trusted setup is already loaded')) {
    // NB: The c-kzg lib has no way of checking whether the setup is loaded or not,
    // and it throws an error if it's already loaded, even though nothing is wrong.
    // This is a rudimentary way of ensuring we load the trusted setup if we need it.
  } else {
    throw new Error(error);
  }
}
```
The test suite:

```typescript
describe('blob', () => {
  it.each([10, 100, 400])('our BLS library should correctly commit to a blob of %p items', async size => {
    const blobItems: Fr[] = Array(size).fill(new Fr(size + 1));
    const ourBlob = await Blob.fromFields(blobItems);

    const point = BLS12Point.decompress(ourBlob.commitment);

    // Double check we correctly decompress the commitment
    const recompressed = point.compress();
    expect(recompressed.equals(ourBlob.commitment)).toBeTruthy();

    let commitment = BLS12Point.ZERO;
    const setupG1Points: BLS12Point[] = trustedSetup['g1_lagrange_bit_reversed']
      .slice(0, size)
      .map((s: string) => BLS12Point.decompress(fromHex(s)));

    setupG1Points.forEach((p, i) => {
      commitment = commitment.add(p.mul(BLS12Fr.fromBN254Fr(blobItems[i])));
    });

    expect(commitment.equals(point)).toBeTruthy();
  });

  it('should construct and verify a batched blob of 400 items', async () => {
    // Initialise 400 fields. This test shows that a single blob works with batching methods.
    // The values here are used to test Noir's blob evaluation in noir-projects/noir-protocol-circuits/crates/blob/src/blob_batching.nr -> test_400_batched
    const blobItems = Array(400).fill(new Fr(3));
    const blobs = await Blob.getBlobs(blobItems);

    // Challenge for the final opening (z)
    const zis = blobs.map(b => b.challengeZ);
    const finalZ = zis[0];

    // 'Batched' commitment
    const commitments = blobs.map(b => BLS12Point.decompress(b.commitment));

    // 'Batched' evaluation
    const proofObjects = blobs.map(b => computeKzgProof(b.data, finalZ.toBuffer()));
    const evalYs = proofObjects.map(p => BLS12Fr.fromBuffer(Buffer.from(p[1])));
    const qs = proofObjects.map(p => BLS12Point.decompress(Buffer.from(p[0])));

    // Challenge gamma
    const evalYsToBLSBignum = evalYs.map(y => y.toNoirBigNum());
    const hashedEvals = await Promise.all(evalYsToBLSBignum.map(e => poseidon2Hash(e.limbs.map(Fr.fromHexString))));
    const finalGamma = BLS12Fr.fromBN254Fr(await poseidon2Hash([hashedEvals[0], zis[0]]));

    let batchedC = BLS12Point.ZERO;
    let batchedQ = BLS12Point.ZERO;
    let finalY = BLS12Fr.ZERO;
    let powGamma = new BLS12Fr(1n); // Since we start at gamma^0 = 1
    let finalBlobCommitmentsHash: Buffer = Buffer.alloc(0);
    for (let i = 0; i < blobs.length; i++) {
      const cOperand = commitments[i].mul(powGamma);
      const yOperand = evalYs[i].mul(powGamma);
      const qOperand = qs[i].mul(powGamma);
      batchedC = batchedC.add(cOperand);
      batchedQ = batchedQ.add(qOperand);
      finalY = finalY.add(yOperand);
      powGamma = powGamma.mul(finalGamma);
      finalBlobCommitmentsHash = sha256ToField([finalBlobCommitmentsHash, blobs[i].commitment]).toBuffer();
    }

    expect(batchedC.equals(commitments[0])).toBeTruthy();
    expect(finalY.equals(evalYs[0])).toBeTruthy();
    expect(finalBlobCommitmentsHash.equals(sha256ToField([blobs[0].commitment]).toBuffer())).toBeTruthy();

    const batchedBlob = await BatchedBlob.batch(blobs);

    expect(batchedC.equals(batchedBlob.commitment)).toBeTruthy();
    expect(batchedQ.equals(batchedBlob.q)).toBeTruthy();
    expect(finalZ.equals(batchedBlob.z)).toBeTruthy();
    expect(finalY.equals(batchedBlob.y)).toBeTruthy();
    expect(finalBlobCommitmentsHash.equals(batchedBlob.blobCommitmentsHash.toBuffer())).toBeTruthy();

    const isValid = verifyKzgProof(batchedC.compress(), finalZ.toBuffer(), finalY.toBuffer(), batchedQ.compress());
    expect(isValid).toBe(true);
  });

  it('should construct and verify a batch of 3 full blobs', async () => {
    // The values here are used to test Noir's blob evaluation in noir-projects/noir-protocol-circuits/crates/blob/src/blob_batching.nr -> test_full_blobs_batched
    // Initialise enough fields to require 3 blobs
    const items = [new Fr(3), new Fr(4), new Fr(5)].map(f =>
      new Array(FIELDS_PER_BLOB).fill(f).map((elt, i) => elt.mul(new Fr(i + 1))),
    );
    const blobs = await Blob.getBlobs(items.flat());

    // Challenge for the final opening (z)
    const zis = blobs.map(b => b.challengeZ);
    const finalZ = await poseidon2Hash([await poseidon2Hash([zis[0], zis[1]]), zis[2]]);

    // Batched commitment
    const commitments = blobs.map(b => BLS12Point.decompress(b.commitment));

    // Batched evaluation
    // NB: we share the same finalZ between blobs
    const proofObjects = blobs.map(b => computeKzgProof(b.data, finalZ.toBuffer()));
    const evalYs = proofObjects.map(p => BLS12Fr.fromBuffer(Buffer.from(p[1])));
    const qs = proofObjects.map(p => BLS12Point.decompress(Buffer.from(p[0])));

    // Challenge gamma
    const evalYsToBLSBignum = evalYs.map(y => y.toNoirBigNum());
    const hashedEvals = await Promise.all(evalYsToBLSBignum.map(e => poseidon2Hash(e.limbs.map(Fr.fromHexString))));
    const finalGamma = BLS12Fr.fromBN254Fr(
      await poseidon2Hash([
        await poseidon2Hash([await poseidon2Hash([hashedEvals[0], hashedEvals[1]]), hashedEvals[2]]),
        finalZ,
      ]),
    );

    let batchedC = BLS12Point.ZERO;
    let batchedQ = BLS12Point.ZERO;
    let finalY = BLS12Fr.ZERO;
    let powGamma = new BLS12Fr(1n); // Since we start at gamma^0 = 1
    let finalBlobCommitmentsHash: Buffer = Buffer.alloc(0);
    for (let i = 0; i < 3; i++) {
      const cOperand = commitments[i].mul(powGamma);
      const yOperand = evalYs[i].mul(powGamma);
      const qOperand = qs[i].mul(powGamma);
      batchedC = batchedC.add(cOperand);
      batchedQ = batchedQ.add(qOperand);
      finalY = finalY.add(yOperand);
      powGamma = powGamma.mul(finalGamma);
      finalBlobCommitmentsHash = sha256ToField([finalBlobCommitmentsHash, blobs[i].commitment]).toBuffer();
    }

    const batchedBlob = await BatchedBlob.batch(blobs);

    expect(batchedC.equals(batchedBlob.commitment)).toBeTruthy();
    expect(batchedQ.equals(batchedBlob.q)).toBeTruthy();
    expect(finalZ.equals(batchedBlob.z)).toBeTruthy();
    expect(finalY.equals(batchedBlob.y)).toBeTruthy();
    expect(finalBlobCommitmentsHash.equals(batchedBlob.blobCommitmentsHash.toBuffer())).toBeTruthy();

    const isValid = verifyKzgProof(batchedC.compress(), finalZ.toBuffer(), finalY.toBuffer(), batchedQ.compress());
    expect(isValid).toBe(true);
  });

  it.each([
    3, 5, 10,
    // 32 <- NB Full 32 blocks currently takes around 30s to fully batch
  ])('should construct and verify a batch of blobs over %p blocks', async blocks => {
    const items = new Array(FIELD_ELEMENTS_PER_BLOB * blocks * BLOBS_PER_BLOCK)
      .fill(Fr.ZERO)
      .map((_, i) => new Fr(BigInt(i) + randomBigInt(120n)));

    const blobs = [];
    for (let i = 0; i < blocks; i++) {
      const start = i * FIELD_ELEMENTS_PER_BLOB * BLOBS_PER_BLOCK;
      blobs.push(...(await Blob.getBlobs(items.slice(start, start + FIELD_ELEMENTS_PER_BLOB * BLOBS_PER_BLOCK))));
    }
    // BatchedBlob.batch() performs a verification check:
    await BatchedBlob.batch(blobs);
  });
});
```
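The accumulation loop in these tests relies on the linearity of KZG: with a random challenge gamma, the combined opening value `sum_i gamma^i * p_i(z)` equals the evaluation at `z` of the combined polynomial `sum_i gamma^i * p_i`, so one batched proof can cover many blobs. This identity can be checked with plain bigint arithmetic over a toy prime field; the sketch below is purely illustrative (the tiny modulus `P` and the polynomial values are made up, standing in for the BLS12-381 scalar field and real blob data).

```typescript
const P = 101n; // toy prime modulus, stand-in for the BLS12-381 scalar field
const mod = (x: bigint): bigint => ((x % P) + P) % P;

// Evaluate a polynomial (coefficients low-to-high) at z, Horner's rule.
function evalPoly(coeffs: bigint[], z: bigint): bigint {
  let acc = 0n;
  for (let i = coeffs.length - 1; i >= 0; i--) acc = mod(acc * z + coeffs[i]);
  return acc;
}

// Three toy "blob polynomials", a shared opening point z, and a challenge gamma.
const polys: bigint[][] = [
  [3n, 1n, 4n],
  [1n, 5n, 9n],
  [2n, 6n, 5n],
];
const z = 7n;
const gamma = 13n;

// Batched evaluation, accumulated exactly as in the test's loop:
// finalY = sum_i gamma^i * p_i(z).
let finalY = 0n;
let powGamma = 1n; // gamma^0 = 1
for (const p of polys) {
  finalY = mod(finalY + powGamma * evalPoly(p, z));
  powGamma = mod(powGamma * gamma);
}

// Combine the polynomials coefficient-wise with the same gamma powers
// (this is what combining commitments does "in the exponent").
const batched = [0n, 0n, 0n];
powGamma = 1n;
for (const p of polys) {
  p.forEach((c, j) => (batched[j] = mod(batched[j] + powGamma * c)));
  powGamma = mod(powGamma * gamma);
}

// Both routes agree: opening the batch equals batching the openings.
console.log(finalY === evalPoly(batched, z)); // → true
```

In the real tests the same gamma-power weights are applied to commitments (`batchedC`), quotients (`batchedQ`), and evaluations (`finalY`), so a single `verifyKzgProof` call checks all blobs at once.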