
op-batcher: prevent SpanChannelOut RLP bytes overflowing MaxRLPBytesPerChannel #14310

Conversation

OlgaYang (Contributor)

Description

We noticed that when a channel's RLP bytes exceed MaxRLPBytesPerChannel, the batcher does not remove the excess batch. This causes op-node to hit a derivation error:

msg="failed to read batch from channel reader, skipping to next channel now" err="rlp: value size exceeds available input length"
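The fix can be sketched as a size guard in the channel builder: before accepting a batch, check whether the channel's total RLP payload would exceed MaxRLPBytesPerChannel, and reject the batch so the channel is left unchanged. This is a minimal stand-in with hypothetical names (`spanChannelOut`, `addBatch`), not the actual op-batcher implementation; in the real code, span batches re-encode the whole channel, so the check compares the replacement payload against the limit.

```go
package main

import (
	"errors"
	"fmt"
)

// errTooManyRLPBytes signals that accepting a batch would push the
// channel's RLP payload over the configured limit. (Hypothetical value;
// the real op-batcher defines its own error.)
var errTooManyRLPBytes = errors.New("batch would cause RLP bytes to go over limit")

// spanChannelOut is a minimal stand-in for the batcher's channel builder:
// it only tracks how many RLP bytes have been accepted so far.
type spanChannelOut struct {
	rlpLength   int // RLP bytes accepted into the channel so far
	maxRLPBytes int // MaxRLPBytesPerChannel
}

// addBatch accepts a batch only if its RLP encoding still fits under the
// limit; otherwise it rejects the batch and leaves the channel unchanged,
// which is the behaviour this PR enforces.
func (c *spanChannelOut) addBatch(batchRLPSize int) error {
	if c.rlpLength+batchRLPSize > c.maxRLPBytes {
		return errTooManyRLPBytes
	}
	c.rlpLength += batchRLPSize
	return nil
}

func main() {
	c := &spanChannelOut{maxRLPBytes: 10_000_000}
	fmt.Println(c.addBatch(9_925_704)) // fits under the limit
	fmt.Println(c.addBatch(118_086))   // rejected: would exceed the limit
}
```

Without a guard like this, the oversized payload is written anyway and op-node fails when decoding the channel, as in the error above.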

Tests

I added the test testSpanChannelOut_ExceedMaxRLPBytesPerChannel to ensure that the batcher removes the excess batch when the channel RLP bytes exceed MaxRLPBytesPerChannel.

Additional context

Batcher Log
lvl=info msg="Channel closed" input_bytes=10043790 output_bytes=74565 full_reason="channel full: could not take 10043790 bytes as replacement of channel of 9925704 bytes, max is 10000000. err: batch would cause RLP bytes to go over limit" compr_ratio=0.007423990346273668

Op-Node Derivation Log
msg="failed to read batch from channel reader, skipping to next channel now" err="rlp: value size exceeds available input length"

Metadata

@OlgaYang OlgaYang requested review from a team as code owners February 11, 2025 05:42
@OlgaYang OlgaYang requested a review from ajsutton February 11, 2025 05:42
@geoknee geoknee added the A-op-batcher Area: op-batcher label Feb 11, 2025

geoknee commented Feb 11, 2025

/ci authorize 0733746

@geoknee geoknee self-requested a review February 12, 2025 14:52
@geoknee geoknee self-assigned this Feb 13, 2025
@geoknee geoknee left a comment

Thanks for this fix. Have a few tweaks I would like and then we can get this in!


OlgaYang commented Feb 14, 2025

Hi geoknee,
I have addressed all of the feedback above, especially this point: "Why don't we do a 'fresh compression' here as we do below?"

I had missed performing a fresh compression, which is an important part of the process. Thank you for catching that!


geoknee commented Feb 14, 2025

/ci authorize 63cf3d7


geoknee commented Feb 14, 2025

Thanks -- a couple of small items are outstanding. Are you able to address those?


OlgaYang commented Feb 14, 2025

Sorry, may I know which small outstanding items I missed?


geoknee commented Feb 14, 2025

Sorry, may I know which small outstanding items I missed?

The unresolved conversations above: one concerns a message and the other a variable name.

@OlgaYang

I have addressed the feedback above.


geoknee commented Feb 14, 2025

/ci authorize 2ee7a30

@geoknee geoknee changed the title Fix batcher packing channel that exceed MaxRLPBytesPerChannel in RLP bytes op-batcher: Prevent SpanChannelOut RLP bytes overflowing MaxRLPBytesPerChannel Feb 14, 2025
@geoknee geoknee changed the title op-batcher: Prevent SpanChannelOut RLP bytes overflowing MaxRLPBytesPerChannel op-batcher: prevent SpanChannelOut RLP bytes overflowing MaxRLPBytesPerChannel Feb 14, 2025
@geoknee geoknee enabled auto-merge February 14, 2025 15:22
@geoknee geoknee added this pull request to the merge queue Feb 14, 2025
Merged via the queue into ethereum-optimism:develop with commit dcce927 Feb 14, 2025
47 checks passed