Fix chunking bug with compound dtypes #1146

Draft: wants to merge 4 commits into base: main
5 changes: 5 additions & 0 deletions .github/workflows/dev-testing.yml
@@ -1,6 +1,11 @@
name: Dev Branch Testing
on:
workflow_dispatch:
inputs:
python-versions:
description: 'List of Python versions to use in matrix, as JSON string'
required: true
type: string
workflow_call:
inputs:
python-versions:
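For context on the `python-versions` input added above: a caller would typically parse the JSON string with `fromJSON` to build the test matrix. A minimal sketch, assuming a hypothetical `run` job (the job and step names are illustrative, not part of this PR):

```yaml
jobs:
  run:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # inputs.python-versions is a JSON string, e.g. '["3.9", "3.10", "3.11"]'
        python-version: ${{ fromJSON(inputs.python-versions) }}
    steps:
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
```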
@@ -7,6 +7,7 @@
import h5py
import numcodecs
import numpy as np
import pynwb
import zarr
from hdmf import Container
from hdmf.utils import get_data_shape
@@ -258,9 +259,24 @@ def from_neurodata_object(cls, neurodata_object: Container, dataset_name: Litera
and `timestamps`, each of which can be configured separately.
"""
location_in_file = _find_location_in_memory_nwbfile(neurodata_object=neurodata_object, field_name=dataset_name)

candidate_dataset = getattr(neurodata_object, dataset_name)
full_shape = get_data_shape(data=candidate_dataset)

manager = pynwb.get_manager()
namespace_catalog = manager.type_map.namespace_catalog
spec = None
for namespace in namespace_catalog.namespaces:
    try:
        spec = namespace_catalog.get_spec(namespace, neurodata_object.parent.neurodata_type)
        break
    except ValueError:
        continue
dataset_spec = spec.get_dataset(neurodata_object.name) if spec is not None else None
dataset_spec = dataset_spec if dataset_spec is not None else {}
spec_dtype = dataset_spec.get("dtype", None)
if isinstance(spec_dtype, list):  # compound dtype: chunk only along the row axis
    full_shape = (len(candidate_dataset),)
else:
    full_shape = get_data_shape(data=candidate_dataset)

dtype = _infer_dtype(dataset=candidate_dataset)

if isinstance(candidate_dataset, GenericDataChunkIterator):
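For reviewers, here is a minimal pure-Python sketch of why compound rows misreport their shape. The `naive_shape` helper is hypothetical: it only mimics the recursive sequence probing that generic shape inference (like `hdmf.utils.get_data_shape`) performs, and is not the actual hdmf implementation:

```python
def naive_shape(data):
    """Recursively probe nested sequences, treating each level as an axis."""
    shape = []
    while hasattr(data, "__len__") and not isinstance(data, (str, bytes)):
        shape.append(len(data))
        if len(data) == 0:
            break
        data = data[0]  # descend into the first element
    return tuple(shape)

# Rows of a compound dataset, stored as tuples of heterogeneous fields.
rows = [(0.0, "a"), (1.0, "b")]

print(naive_shape(rows))  # (2, 2): the tuple fields look like a second axis
print((len(rows),))       # (2,): the row-count-only shape this PR uses instead
```

Because each row of a compound dataset is a single record rather than a sub-array, only the row count is a meaningful chunking axis, which is why the change above falls back to `(len(candidate_dataset),)` when the spec declares a list-valued (compound) dtype.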