Feature request: allow specifying the size of the chunks provided by File#createReadStream() #860
Hello,

At the moment, the chunks returned by the File#createReadStream() method are 16kb long. I believe that's Node's default highWaterMark value. The issue with this is that it consumes too much memory to read a big file: for a 250mb file, that's about 16,000 chunks, and it consumes 2GB of memory before the process crashes.

If you are okay with it, I'd like to submit a PR that allows specifying the chunk size. I will look into the createReadStream() code tonight and see what I can do. Meanwhile, any help or guidance is welcome.

Thanks,
Samy
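For context, 16kb (16384 bytes) is the default highWaterMark of Node's generic Readable streams, and Node's own fs.createReadStream already accepts a highWaterMark option to override it. A short sketch illustrating both, plus the hypothetical equivalent this request asks for (the file path and the proposed option are assumptions, not existing API):

```js
const fs = require('fs');
const { Readable } = require('stream');

// Node's generic Readable defaults to 16kb chunks:
console.log(new Readable().readableHighWaterMark); // 16384

// fs.createReadStream already lets callers override the chunk size:
const local = fs.createReadStream('/tmp/big-file.bin', {
  highWaterMark: 1024 * 1024, // read in 1 MiB chunks
});

// Hypothetical equivalent this issue requests (not an existing option):
// file.createReadStream({ highWaterMark: 1024 * 1024 });
```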
Comments

@Dinduks Thank you for reporting this. Could you share your environment details and a code snippet that causes the issue? I've tried to reproduce this in a couple of ways with a 400mb file.
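A minimal sketch of one such reproduction attempt (the bucket and object names below are placeholders), counting chunks and logging heap usage while streaming:

```js
const { Storage } = require('@google-cloud/storage');

const file = new Storage().bucket('my-bucket').file('big-file.bin'); // placeholders

let chunks = 0;
file
  .createReadStream()
  .on('data', (chunk) => {
    chunks += 1;
    if (chunks % 1000 === 0) {
      const heapMb = process.memoryUsage().heapUsed / 1024 / 1024;
      console.log(`${chunks} chunks (last ${chunk.length} bytes), heap ${heapMb.toFixed(1)} MB`);
    }
  })
  .on('end', () => console.log(`done after ${chunks} chunks`))
  .on('error', console.error);
```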
@AVaksman Hi Alex. Indeed, there is no issue with @google-cloud/storage itself. The memory issues come from the thousands of operations run inside my per-chunk handler. I think being able to grab chunks bigger than 16kb would solve this problem by reducing the number of times that handler runs. Meanwhile, I'll try to optimize those operations. I could provide the sample if you think it's relevant.
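One way to get bigger chunks today, independent of the library, is to coalesce the 16kb chunks with a small Transform before doing the per-chunk work. A sketch of that idea (the 1 MiB threshold is an arbitrary assumption):

```js
const { Transform } = require('stream');

// Buffers incoming chunks until they reach a minimum size, so downstream
// per-chunk work runs far less often.
class Coalesce extends Transform {
  constructor(minBytes = 1024 * 1024) { // 1 MiB threshold (assumption)
    super();
    this.minBytes = minBytes;
    this.buffered = [];
    this.bufferedBytes = 0;
  }
  _transform(chunk, encoding, callback) {
    this.buffered.push(chunk);
    this.bufferedBytes += chunk.length;
    if (this.bufferedBytes >= this.minBytes) {
      this.push(Buffer.concat(this.buffered));
      this.buffered = [];
      this.bufferedBytes = 0;
    }
    callback();
  }
  _flush(callback) {
    if (this.bufferedBytes > 0) this.push(Buffer.concat(this.buffered));
    callback();
  }
}

// Usage: file.createReadStream().pipe(new Coalesce()).on('data', handleLargeChunk);
```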
Would you mind providing us with simple reproduction code that we can try to replicate?

Ok @jkwlui.
@Dinduks Please feel free to provide more information, but I believe this is an issue to solve at the application level, as opposed to our level in the library. You could look into using something like throttle, or otherwise queueing the operations you run for each chunk. I'm going to close the issue, but I'm happy to re-open it if there's more we should do on our side.
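A sketch of the queueing idea, assuming file is the GCS File object from above and processChunk is a hypothetical async function doing the per-chunk work. Piping into a Writable whose callback fires only after the previous chunk's work finishes lets Node's built-in backpressure pause the download, keeping memory bounded:

```js
const { Writable, pipeline } = require('stream');

// Hypothetical per-chunk work; replace with the real application logic.
async function processChunk(chunk) {
  // ...
}

const worker = new Writable({
  highWaterMark: 64 * 1024, // buffer at most ~64kb before pausing the source (assumption)
  write(chunk, encoding, callback) {
    // Deferring callback() until the work completes applies backpressure
    // upstream, so chunks are not read faster than they are processed.
    processChunk(chunk).then(() => callback(), callback);
  },
});

pipeline(file.createReadStream(), worker, (err) => {
  if (err) console.error('stream failed:', err);
  else console.log('all chunks processed');
});
```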