I just stumbled upon the issue that HDF5.jl chokes on datasets which were initialised to 0-size but never resized or populated later.
A workaround is of course to not create empty datasets at all, but I currently set up my data processing in a homogeneous way, so that each HDF5 file has the same structure and sometimes a dataset is empty.
Maybe I am doing something wrong and this is not the way to initialise datasets which might be empty, but it is what I found in the HDF5 docs and examples.
Here is an MWE; uncomment the two marked lines and the error disappears:
using HDF5
fname = tempname()
groupname = "a_group"
dsetname = "foo"

h5open(fname, "w") do fid
    create_group(fid, groupname)
    create_dataset(fid[groupname], dsetname, datatype(Float32), ((0,), (-1,)), chunk=(100,))
    # Uncomment these, otherwise the dataset is unreadable
    #dset = fid[groupname][dsetname]
    #HDF5.set_extent_dims(dset, (100,))
end

h5open(fname, "r") do fid
    fid[groupname][dsetname][:]
end
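Until this is fixed, one possible guard on the reading side (a sketch, not an official HDF5.jl recommendation) is to query the dataset's current extent with size() and return an empty vector instead of slicing a 0-size dataset:

```julia
using HDF5

h5open(fname, "r") do fid
    dset = fid[groupname][dsetname]
    # size(dset) queries the current extent without reading data;
    # for a 0-size dataset we hand back an empty Float32 vector
    # instead of triggering the hyperslab "index out of range" error.
    prod(size(dset)) == 0 ? Float32[] : dset[:]
end
```

This keeps the homogeneous file layout intact and only changes how the files are consumed.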
Here is the error which indicates that the binary parsing loses track:
UInt64[0x0000000000000000]
UInt64[0x0000000000000001]
UInt64[0x0000000000000000]
(0,)
index out of range
Stacktrace:
[1] error(s::String)
@ Base ./error.jl:33
[2] hyperslab(dspace::HDF5.Dataspace, I::Base.Slice{Base.OneTo{Int64}})
@ HDF5 ~/.julia/packages/HDF5/0iEnL/src/HDF5.jl:1691
[3] generic_read(obj::HDF5.Dataset, filetype::HDF5.Datatype, #unused#::Type{Float32}, I::Function)
@ HDF5 ~/.julia/packages/HDF5/0iEnL/src/HDF5.jl:1252
[4] getindex(dset::HDF5.Dataset, I::Function)
@ HDF5 ~/.julia/packages/HDF5/0iEnL/src/HDF5.jl:1188
[5] (::var"#77#78")(fid::HDF5.File)
@ Main ./In[28]:17
[6] h5open(::var"#77#78", ::String, ::Vararg{String, N} where N; swmr::Bool, pv::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ HDF5 ~/.julia/packages/HDF5/0iEnL/src/HDF5.jl:458
[7] h5open(::Function, ::String, ::String)
@ HDF5 ~/.julia/packages/HDF5/0iEnL/src/HDF5.jl:456
[8] top-level scope
@ In[28]:16
[9] eval
@ ./boot.jl:360 [inlined]
[10] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
@ Base ./loading.jl:1094