How to pipe a writeable stream? #472
Comments
Hi @jz222, why do you want to remove the loop? It's just a loop and won't take any extra memory or anything. You can use the functional Array method:

```js
dataArray.forEach(obj => asyncParser.input.push(obj));
asyncParser.input.push(null);
```

#468 will allow you to do

```js
asyncParser
  .from(dataArray)
  .to(writableStream);
```

but that won't be in until v6. In any case, where are you getting…
This might make a good example of writing to a writable stream. Maybe a recipes section in the docs would be nice.
Thanks for getting back to me @juanjoDiaz. We are working on an integration platform. The … Thank you for your time.
No problem. If you are integrating data from various sources and doing data manipulations, I'd really recommend that you look into doing everything with streams, since they allow you to keep memory usage under control regardless of the amount of data you are processing. But that's just my advice that no one asked for 😅 I'll close this issue. Feel free to reopen if you feel that there is something still to be answered.
Hi,
I would like to pipe the json2csv parser to a writable stream to write the CSV file directly to a bucket without blocking the event loop, as we are processing rather large sets of data. I came up with the solution below; however, I was hoping there is a more elegant one. I already have the data as an array of objects, each representing one row in the CSV file. What I don't like is that I loop through the data array, and I was wondering how this can be avoided.