Test ROS3 streaming in CI and add ROS3 tutorial #1393
Merged
Changes from all commits (38 commits)
21258a7  Test ROS3 CI (rly)
77d6743  Run the new job (rly)
83cd67d  Fixes (rly)
7bdcaf8  Try bypassing tox+conda (rly)
ef222f5  Use h5py>=3.2 (rly)
c0201ec  Second try (rly)
4b054a1  Different approach, pull out ros3 requirements (rly)
4f790b2  Testing (rly)
be5bca7  Conda env fix (rly)
bfd0ceb  CI fix (rly)
d245646  CI fix (rly)
04208e2  Finish CircleCI config, set up Azure (rly)
65be04f  Fix Azure config (rly)
22f117f  Add conda to Azure CI system path (rly)
f530a48  Add OS-specific commands to Azure (rly)
2879e55  Activate conda env (rly)
9d27237  Fix Azure (rly)
563b9ee  Fix Azure (rly)
4bc1331  Fix Windows Azure (rly)
60c409b  Fix typo (rly)
5d39042  Remove ROS3 test on Windows/MacOS on every PR (rly)
673727f  Update changelog, add basic tutorial (rly)
b4c0dc8  Move dandi helper functions to pynwb.utils (rly)
444f939  Update tutorial (rly)
b7ae928  Add testing for dandi utils (rly)
7effced  Use a test file from dandi s3 instead of actual data (rly)
226313c  add thumbnail for streaming tutorial (oruebel)
b032b6e  Use NWB test dandiset (rly)
656227c  Merge branch 'ci/ros3' of https://github.com/NeurodataWithoutBorders/… (rly)
1448b4e  update to use the latest dandi tools (bendichter)
1677f91  Cleanup, remove references to requests package (rly)
781277d  Update docs (rly)
c305176  Update docs (rly)
1684074  Update docs (rly)
5447927  Fix docs (rly)
c0b21c3  Clean up ref to removed file (rly)
57cf80b  Update changelog (rly)
772236b  Merge branch 'ci/ros3' of https://github.com/NeurodataWithoutBorders/… (rly)
@@ -0,0 +1,55 @@
'''
Streaming from an S3 Bucket
===========================

Using PyNWB 2, HDMF 3, and h5py 3.2+, you can now stream data from an NWB file stored in an S3 bucket, such as data
from the `DANDI Archive <https://dandiarchive.org/>`_. This is especially useful for reading small pieces of data
from a large NWB file stored remotely.
'''

####################
# Streaming data from an S3 bucket requires having HDF5 installed with the ROS3 (read-only S3) driver.
# You can install HDF5 with the ROS3 driver from `conda-forge <https://conda-forge.org/>`_ using ``conda``.
# You may first need to uninstall a currently installed version of h5py.
#
# .. code-block:: bash
#
#     pip uninstall h5py
#     conda install -c conda-forge "h5py>=3.2"
#

####################
# Next, use the ``DandiAPIClient`` to get the S3 URL to an NWB file of interest stored in the DANDI Archive.
# If you have not already, install the latest release of the ``dandi`` package.
#
# .. code-block:: bash
#
#     pip install dandi
#
# .. code-block:: python
#
#     from dandi.dandiapi import DandiAPIClient
#
#     dandiset_id = '000006'  # ephys dataset from the Svoboda Lab
#     filepath = 'sub-anm372795/sub-anm372795_ses-20170718.nwb'  # 450 kB file
#     with DandiAPIClient() as client:
#         asset = client.get_dandiset(dandiset_id, 'draft').get_asset_by_path(filepath)
#         s3_path = asset.get_content_url(follow_redirects=1, strip_query=True)

####################
# Finally, instantiate a :py:class:`~pynwb.NWBHDF5IO` object with the S3 URL and specify the driver as "ros3". This
# will download metadata about the file from the S3 bucket to memory. The values of datasets are accessed lazily,
# just like when reading an NWB file stored locally. So, slicing into a dataset will require additional time to
# download the sliced data (and only the sliced data) to memory.
#
# .. code-block:: python
#
#     from pynwb import NWBHDF5IO
#
#     with NWBHDF5IO(s3_path, mode='r', load_namespaces=True, driver='ros3') as io:
#         nwbfile = io.read()
#         print(nwbfile)
#         print(nwbfile.acquisition['lick_times'].time_series['lick_left_times'].data[:])
#

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_streaming.png'
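The tutorial above assumes an h5py build that ships the ROS3 driver. A minimal sketch (not part of this PR) of how one might confirm that before attempting to stream, by inspecting h5py's registered drivers:

import h5py

# 'ros3' appears among the registered drivers only if the underlying HDF5
# library was built with read-only S3 support (e.g., the conda-forge builds
# referenced in the tutorial above).
if 'ros3' in h5py.registered_drivers():
    print('ROS3 driver is available; remote streaming should work.')
else:
    print('ROS3 driver is missing; install h5py from conda-forge as shown above.')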
Binary file not shown.
@@ -0,0 +1,13 @@
# pinned dependencies to reproduce an entire development environment to use PyNWB with ROS3 support
name: ros3
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.9
  - h5py==3.3.0
  - hdmf==3.1.1
  - numpy==1.21.0
  - pandas==1.3.0
  - python-dateutil==2.8.1
  - setuptools
File renamed without changes.
Empty file.
@@ -0,0 +1,23 @@
from pynwb import NWBHDF5IO
from pynwb.testing import TestCase


class TestRos3Streaming(TestCase):
    # requires h5py to be built with the ROS3 driver: conda install -c conda-forge h5py

    def test_read(self):
        s3_path = 'https://dandiarchive.s3.amazonaws.com/ros3test.nwb'

        with NWBHDF5IO(s3_path, mode='r', load_namespaces=True, driver='ros3') as io:
            nwbfile = io.read()
            test_data = nwbfile.acquisition['ts_name'].data[:]
            self.assertEqual(len(test_data), 3)

    def test_dandi_read(self):
        # this is the NWB Test Data dandiset #000126 sub-1/sub-1.nwb
        s3_path = 'https://dandiarchive.s3.amazonaws.com/blobs/11e/c89/11ec8933-1456-4942-922b-94e5878bb991'

        with NWBHDF5IO(s3_path, mode='r', load_namespaces=True, driver='ros3') as io:
            nwbfile = io.read()
            test_data = nwbfile.acquisition['TestData'].data[:]
            self.assertEqual(len(test_data), 3)
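These tests require an h5py build that includes the ROS3 driver (see the comment in the class). A minimal, hypothetical sketch of how such a test could be skipped automatically when the driver is unavailable; the guarded class name is illustrative and not part of this PR:

import unittest

import h5py
from pynwb import NWBHDF5IO

# Skip the whole test class when the installed h5py has no read-only S3 support.
@unittest.skipIf('ros3' not in h5py.registered_drivers(),
                 'h5py was not built with the ROS3 driver')
class TestRos3StreamingGuarded(unittest.TestCase):

    def test_read(self):
        s3_path = 'https://dandiarchive.s3.amazonaws.com/ros3test.nwb'
        with NWBHDF5IO(s3_path, mode='r', load_namespaces=True, driver='ros3') as io:
            nwbfile = io.read()
            self.assertEqual(len(nwbfile.acquisition['ts_name'].data[:]), 3)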
Is there a reason for using a context manager here on read? I'm just wondering, since we normally tend to use contexts only on write.
That is a good question. I had always followed the rule of thumb of using a context manager when opening a file, to be safe. It should not make a difference in virtually all cases: the file will be closed on garbage collection, and having the file open doesn't stop other processes from reading or modifying it. However, if someone is on Windows and is going to delete the file before garbage collection (or thinks another process might delete it), then the file must be explicitly closed before it can be deleted. So I think, to be safe, it is ever so slightly better to use a context manager.
See also:
https://stackoverflow.com/questions/56149237/do-i-need-to-manually-close-a-hdf5-file
https://stackoverflow.com/questions/7395542/is-explicitly-closing-files-important/7395906#7395906
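A minimal sketch of the two read styles being compared; the file path and dataset name here are placeholders, not from this PR:

from pynwb import NWBHDF5IO

# Style 1: context manager -- the file handle is closed as soon as the
# block exits, even if reading raises an exception.
with NWBHDF5IO('example.nwb', mode='r') as io:
    nwbfile = io.read()
    data = nwbfile.acquisition['example_series'].data[:]

# Style 2: explicit open/close -- the handle stays open until close() is
# called (or the object is garbage collected), which on Windows can block
# deleting the file in the meantime.
io = NWBHDF5IO('example.nwb', mode='r')
nwbfile = io.read()
data = nwbfile.acquisition['example_series'].data[:]
io.close()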