
SF#147 resuming a multipart upload only skips up to 1000 existing files/parts #346

Closed
mdomsch opened this issue Jun 15, 2014 · 5 comments

@mdomsch
Contributor

mdomsch commented Jun 15, 2014

(From https://sourceforge.net/p/s3tools/bugs/147/)
Milestone: Enhancement_request
Status: open
Owner: nobody
Labels: None
Priority: 5
Updated: 3 hours ago
Created: 3 hours ago
Creator: anierman
Private: No

A multipart upload failed after several thousand 15MB files/parts had been uploaded.

After restarting the upload with put and --upload-id only the first 1000 files/parts were skipped due to md5/size matches. Re-uploading began at file/part 1001.

The issue is that list_multipart in S3.py returns at most 1000 records, per the ListParts API limit:
http://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadListParts.html

To skip all of the appropriate matches past 1000, IsTruncated could be detected in the response and further pages retrieved in a loop (making use of part-number-marker).
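The loop described above could be sketched roughly as follows. This is a minimal illustration of the pagination pattern, not s3cmd's actual code: `list_all_parts` and the `list_parts_page` callable are hypothetical stand-ins for the request logic in S3.py.

```python
def list_all_parts(list_parts_page):
    """Collect every part of a multipart upload, not just the first page.

    `list_parts_page` (hypothetical) takes a part-number marker and returns
    a dict shaped like the S3 ListParts response:
        {"Parts": [...], "IsTruncated": bool, "NextPartNumberMarker": int}
    """
    parts = []
    marker = 0
    while True:
        page = list_parts_page(marker)
        parts.extend(page["Parts"])
        # Stop once S3 reports the listing is complete; otherwise feed
        # NextPartNumberMarker back as the part-number-marker parameter.
        if not page.get("IsTruncated"):
            return parts
        marker = page["NextPartNumberMarker"]
```

With this in place, the md5/size comparison on resume would run against the full part list instead of only the first 1000 entries.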

@mdomsch
Contributor Author

mdomsch commented Feb 4, 2015

Same root problem as #414

@BobPusateri

Ran across this issue today. Highly annoying. I will use larger upload chunks in the future so that hopefully all my files have fewer than 1000 parts, but this should really be addressed.

@fviard
Contributor

fviard commented Jun 16, 2020

@BobPusateri Indeed, it is a real issue, and it is the one I'm currently working on, so it should be fixed soon.

@fviard fviard self-assigned this Jun 16, 2020
@fviard fviard added this to the 2.2.0 milestone Jun 16, 2020
@fviard fviard closed this as completed in cefa4b6 Jun 22, 2020
@fviard
Contributor

fviard commented Jun 22, 2020

@BobPusateri FYI, this issue is now fixed in MASTER and will be in the next release 2.2.0.

@jtbandes

jtbandes commented Mar 1, 2021

Thanks for fixing this! Would be great if it made it into a release at some point 😂 But this fix saved my ~7500-part upload after a power outage. Even cooler that it was able to resume from an upload I started with aws s3 cp. 💖
