--max-retries #914
Comments
I don't think many people would use this option, but it looks fine to me to add it to the config options.

It would be great to have this feature in case of failure!

I use an S3-compatible private cloud that I access over a less-than-reliable connection, which is extremely slow relative to the size of the data I'm sending (> 1 TiB at 1.25 MB/s, which takes weeks or more). Meanwhile, the server's network connection goes down almost weekly. I'm willing to try coding up a solution once I have clarity on what kind of pull request you'd accept. I was thinking of adding a concept of "retry profiles" to the config file (retry_profile = normal, retry_profile = aggressive, etc.) and tuning the client's behavior accordingly.
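To make the "retry profiles" idea concrete, here is a hypothetical sketch of what such a section might look like in an s3cmd-style config file. The option name `retry_profile` and the profile values are assumptions from this comment's proposal, not an existing option:

```ini
; Hypothetical retry-profile setting (sketch of the proposal above,
; not a real s3cmd option). Each profile would map to a retry count
; and backoff strategy inside the client.
[default]
retry_profile = aggressive   ; one of: normal | aggressive | patient
```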
A "--max-retries" option was added in MASTER.
Currently the number of retries is hard-coded to 5.
It would help if it were possible to control the number of retries, especially when we're throttling our traffic and trying to upload big files.
Being able to set the value somewhere in the range of 1 to 30 seems reasonable in some cases.
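The behavior being requested can be sketched as a simple retry loop where the attempt count is a parameter instead of a constant. The function name, signature, and exponential backoff below are illustrative assumptions, not the project's actual implementation; only the default of 5 mirrors the hard-coded value described above:

```python
import time


def upload_with_retries(do_upload, max_retries=5, base_delay=1.0):
    """Retry a flaky upload up to max_retries times.

    Hypothetical sketch: the default of 5 matches the hard-coded value
    mentioned in the issue; making max_retries a parameter is the change
    being requested. Uses exponential backoff between attempts.
    """
    for attempt in range(1, max_retries + 1):
        try:
            return do_upload()
        except OSError:
            if attempt == max_retries:
                # Out of attempts: let the caller see the failure.
                raise
            # Wait 1s, 2s, 4s, ... (scaled by base_delay) before retrying.
            time.sleep(base_delay * 2 ** (attempt - 1))
```

With `--max-retries` exposed, a throttled connection uploading large files could pass a value like 30 here instead of being limited to 5.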