
Conversation


raunaqmorarka (Member) commented Aug 18, 2025

Description

* Matches the batch size used in the legacy file system (io.trino.hdfs.s3.TrinoS3FileSystem#DELETE_BATCH_SIZE).
* 1000 is the maximum batch size supported by the S3 client.
* Reduces the potential of getting throttled by S3 (see the sketch below).
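
For illustration only, here is a minimal sketch of how a 1000-key delete-batch limit is typically applied with the AWS SDK v2 S3 client. This is not the Trino implementation; the class and method names below are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.Delete;
import software.amazon.awssdk.services.s3.model.DeleteObjectsRequest;
import software.amazon.awssdk.services.s3.model.ObjectIdentifier;

public class BatchedS3Deleter
{
    // S3's DeleteObjects API accepts at most 1000 keys per request
    private static final int DELETE_BATCH_SIZE = 1000;

    public static void deleteKeys(S3Client s3, String bucket, List<String> keys)
    {
        for (int start = 0; start < keys.size(); start += DELETE_BATCH_SIZE) {
            int end = Math.min(start + DELETE_BATCH_SIZE, keys.size());
            List<ObjectIdentifier> batch = new ArrayList<>();
            for (String key : keys.subList(start, end)) {
                batch.add(ObjectIdentifier.builder().key(key).build());
            }
            // One DeleteObjects call removes the whole batch, so fewer requests
            // hit S3, which lowers the chance of request-rate throttling
            s3.deleteObjects(DeleteObjectsRequest.builder()
                    .bucket(bucket)
                    .delete(Delete.builder().objects(batch).build())
                    .build());
        }
    }
}
```

Because 1000 is the hard upper limit for a single DeleteObjects request, using the full limit minimizes the number of requests issued for a given set of files to delete.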

Additional context and related issues

Release notes

( ) This is not user-visible or is docs only, and no release notes are required.
( ) Release notes are required. Please propose a release note for me.
(x) Release notes are required, with the following suggested text:

## Iceberg
* Reduce query failures caused by S3 throttling errors during delete operations. ({issue}`26432`)

raunaqmorarka merged commit ae6c57b into master on Aug 18, 2025
74 of 76 checks passed
raunaqmorarka deleted the raunaq/delete-batch branch on August 18, 2025 10:36
github-actions bot added this to the 477 milestone on Aug 18, 2025