Original report by Gambling Man (Bitbucket: [Gambling Man](https://bitbucket.org/Gambling Man)).

Would it be possible to add an “Estimated time” value for the downloads? If not, what about a “Total download size: xxx MiB” value, so that we can roughly estimate the time ourselves? Thank you.
Unfortunately the program loads the page list and downloads the files sequentially, so it never knows in advance the total size of the download, which is what would be needed to estimate a realistic total download time. The only thing it knows up front is the number of chapters to download; I could estimate the total time from that value, but it would be very inaccurate, since every page is an image and their sizes vary a lot. Otherwise the program would have to load all the pages first, fetch their metadata to get the sizes, and only then proceed with the real download, but that would increase the total download time and I'm not sure it's worth it. I'll think about this.
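For reference, a minimal sketch of the metadata-first idea, assuming the page image URLs are already known and the server reports a `Content-Length` header; the function name and URLs below are placeholders, not part of the actual program:

```python
# Rough sketch: sum Content-Length via HEAD requests before downloading,
# so a "Total download size: xxx MiB" line can be shown up front.
# Extra round trips per page are the cost mentioned above.
import requests

def estimate_total_size(image_urls):
    """Return (total_bytes, pages_with_known_size) using HEAD requests."""
    total = 0
    known = 0
    for url in image_urls:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        size = resp.headers.get("Content-Length")
        if size is not None:
            total += int(size)
            known += 1
    return total, known

# Hypothetical usage with placeholder URLs
urls = ["https://example.com/chapter1/page1.jpg"]
total_bytes, known = estimate_total_size(urls)
print(f"Total download size ({known}/{len(urls)} pages reported a size): "
      f"{total_bytes / (1024 * 1024):.1f} MiB")
```

Note that not every server returns `Content-Length` for a HEAD request, so the reported total could still be a lower bound.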