Is there any way to read the Parquet-converted records before writing them to a Parquet file?
My requirement is to create a Parquet file directly in Azure Data Lake without storing it locally first.
Using the parquets package, I've been able to stream out Parquet data without saving to a temporary file; it supports Node streams. It writes the data out one row group at a time, though, so you do need enough memory to hold a full row group. This is a limitation of the Parquet format itself.
Can you please provide more insight from the code perspective? It would be very helpful.