Conversation

@mrocklin

When determining whether to compress an array, we take a few samples of length
10000. However, this 10000 was previously the number of elements rather than
the number of bytes. This resulted in odd behavior when an array's size was
smaller than the sample it was trying to extract.

We have resolved this by counting bytes rather than elements.

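For illustration, below is a minimal sketch of the byte-counting sampling described above. The function name `should_compress`, the thresholds, and the use of `zlib` are assumptions for this example and are not the code changed in this pull request; the point is that samples are measured in bytes, so a small payload never asks for more bytes than it contains.

```python
import zlib


def should_compress(data: bytes, sample_size: int = 10_000,
                    n_samples: int = 5, min_ratio: float = 0.9) -> bool:
    """Decide whether compressing ``data`` is likely worthwhile.

    Takes a few samples of ``sample_size`` *bytes* (not elements),
    compresses them, and recommends compression only if the samples
    shrink by at least ``1 - min_ratio``.
    """
    # If the payload is no larger than one sample, just test the whole thing.
    if len(data) <= sample_size:
        return len(zlib.compress(data)) < min_ratio * max(len(data), 1)

    # Take evenly spaced byte ranges across the payload.
    stride = (len(data) - sample_size) // max(n_samples - 1, 1)
    original = compressed = 0
    for i in range(n_samples):
        start = i * stride
        sample = data[start:start + sample_size]
        original += len(sample)
        compressed += len(zlib.compress(sample))
    return compressed < min_ratio * original


# Example: highly regular numeric data compresses well, so this returns True.
import numpy as np
payload = np.arange(1_000_000, dtype="float64").tobytes()
print(should_compress(payload))
```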
@mrocklin mrocklin merged commit 59b842d into dask:master Jan 25, 2017
@mrocklin mrocklin deleted the numpy-serialize-large-dtypes branch January 25, 2017 21:29