Add/icephys meta #1349

Merged 119 commits on Jul 30, 2021
Commits (119)
c2f33f7
Ignore flake8 warning in test_io
oruebel Apr 5, 2021
b68fd8f
Declare SweepTable as deprecated on init
oruebel Apr 5, 2021
b90656a
Add TimeSeriesReferenceVectorData Container class
oruebel Apr 5, 2021
470bc10
Add simple test for TimeSeriesReferenceVectorData
oruebel Apr 5, 2021
f200f76
Update nwb-schema branch
oruebel Apr 6, 2021
a6288d1
Add icephys metadata container classes
oruebel Apr 6, 2021
6849e8a
Update NWBFile to support icephys metadata tables
oruebel Apr 6, 2021
b8b9608
Add unit tests for icephys metadata tables
oruebel Apr 6, 2021
197c454
Update imports and fix SweepTable integration tests
oruebel Apr 7, 2021
d956d1f
Fix test available namespaces testcase
oruebel Apr 7, 2021
27b35c0
Update SweepTable tutorial and declare as deprecated
oruebel Apr 7, 2021
1bcd45c
Add new icephys tutorial using the new hierarchical metadata tables
oruebel Apr 7, 2021
c70720e
Add IntracellularRecordingsTable illustration
oruebel Apr 7, 2021
533c188
Fix flake8 issues
oruebel Apr 7, 2021
de558ba
Remove use of HierarchicalDynamicTableMixin for now
oruebel Apr 8, 2021
cb7c1e2
Add source PowerPoint file for icephys docs
oruebel Apr 14, 2021
cc4c153
Merge branch 'dev' into add/icephys_meta
oruebel Apr 14, 2021
03cc6a3
Fix make clean command for docs to also clean sphinx-gallery files
oruebel Apr 14, 2021
68a5eee
Updated CHANGELOG to describe changes from this PR
oruebel Apr 14, 2021
2a36897
Add thumbnails for domain-specific tutorials
oruebel Apr 14, 2021
cb3d7e6
Add thumbnails for general tutorials
oruebel Apr 14, 2021
f9175b3
Update changelog
oruebel Apr 14, 2021
1dc1349
Remove ic_filtering/icephys_filtering duplicate and fix test
oruebel Apr 17, 2021
281483f
Fix description of ExperimentalConditionsTable
oruebel Apr 28, 2021
becdabb
Merge branch 'dev' into add/icephys_meta
oruebel May 28, 2021
69e3c92
Use nwb schema branch for icephys meta
rly Jun 14, 2021
6ec2d65
Ensure start_index and count are set to -1 if TimeSeries is None
oruebel Jun 17, 2021
412473d
Add test for conversion of start_index and index_count vals
oruebel Jun 17, 2021
a74c6e8
Fix flake8 in __compute_index docstring
oruebel Jun 17, 2021
e4c45fe
Update schema (updated version to 1.4.0-alpha)
rly Jun 18, 2021
49069d4
Fix icephys metadata roundtrip NWBFile test based on updated DynamicTab…
oruebel Jul 6, 2021
44b0d42
Merge branch 'add/icephys_meta' of https://github.com/NeurodataWithou…
oruebel Jul 6, 2021
d2a5fd5
Merge branch 'dev' into add/icephys_meta
rly Jul 7, 2021
8b9e374
Merge branch 'rc/2.0.0' into add/icephys_meta
rly Jul 7, 2021
f459435
Move SweepTable DeprecationWarning to the constructor of SweepTable …
oruebel Jul 7, 2021
89886f7
Suppress SweepTable Deprecation warning in icephys integration test
oruebel Jul 7, 2021
1233e2f
Ignore and assert deprecation warning for SweepTable during SweepTabl…
oruebel Jul 7, 2021
e066034
Merge branch 'rc/2.0.0' into add/icephys_meta
rly Jul 7, 2021
73c8fbc
Merge branch 'rc/2.0.0' into add/icephys_meta
rly Jul 7, 2021
1c0a282
Fix flake8 on icephys tutorial galleries
oruebel Jul 7, 2021
226f4ec
Merge branch 'add/icephys_meta' of https://github.com/NeurodataWithou…
oruebel Jul 7, 2021
92fb8a4
Use extension in docs to simplify linking to common targets
oruebel Jul 7, 2021
48045cd
Move advanced data I/O tutorials to their own section and move parall…
oruebel Jul 7, 2021
d9619c8
Update Changelog for tutorials
oruebel Jul 7, 2021
55effbb
Updated deprecation warning in SweepTable tutorial
oruebel Jul 7, 2021
7a3a554
Fix minor spelling error in make_test_files
oruebel Jul 16, 2021
3498f01
Several corrections and enhancements for the icephys tutorial
oruebel Jul 16, 2021
33b40b6
Moved functions to create icephys test file from tests to pynwb
oruebel Jul 16, 2021
7d39036
Moved functions to create icephys test file from tests to pynwb
oruebel Jul 16, 2021
87a057f
Start for new tutorial to show conversion of icephys tables to pandas…
oruebel Jul 16, 2021
7a0149f
Test to_hierarchical_dataframe and to_denormalized_dataframe functions
oruebel Jul 16, 2021
b968ea8
Updated tutorial to convert icephys tables to pandas
oruebel Jul 17, 2021
b46c825
Fix minor spelling error in NWBFile docstring
oruebel Jul 17, 2021
8e8a897
Add TimeseriesReferenceVectorData.get method to mask missing values o…
oruebel Jul 17, 2021
bbd31d3
Update IntracellularRecordingsTable. Update add_recording defaults fo…
oruebel Jul 17, 2021
26ccfe0
Update create_icephys_testfile to create testdata with missing stimuli
oruebel Jul 17, 2021
be4f573
Update existing tests to match the new behavior
oruebel Jul 17, 2021
5892fad
Update icephys tutorial to match new behavior
oruebel Jul 17, 2021
a6ed1b1
Update icephys pandas tutorial to demonstrate new behavior
oruebel Jul 17, 2021
ea5db4c
Remove debug print statements
oruebel Jul 17, 2021
5873f09
Add table augmentation and queries to the icephys_pandas tutorial
oruebel Jul 17, 2021
30679aa
Fix flake8
oruebel Jul 17, 2021
0d3a4b0
Update to use keyword args
oruebel Jul 18, 2021
34bb0ed
Clarify text in icephys tutorial
oruebel Jul 19, 2021
f422d42
Fix flake8 on tutorial
oruebel Jul 20, 2021
1a27a06
Fix flake8
oruebel Jul 20, 2021
547bb52
Update to use dev branch of nwb-schema
oruebel Jul 20, 2021
9151baa
Set default name and default description for TimeSeriesReferenceVecto…
oruebel Jul 20, 2021
bf5d83c
added icephys_testutils to pynwb.testing.__init__
oruebel Jul 21, 2021
dae7c55
Added test for IntracellularRecordingsTable.to_dataframe with options
oruebel Jul 21, 2021
8acc3db
Always use MaskedArray in TimeSeriesReferenceVectorData instead of Ma…
oruebel Jul 21, 2021
609b769
Add tests for TimeSeriesReferenceVectorData.get
oruebel Jul 21, 2021
276ed1b
Updated change log
oruebel Jul 21, 2021
60cd1a8
Merge pull request #1383 from NeurodataWithoutBorders/add/denormalize…
oruebel Jul 22, 2021
6c07e59
Fix minor docstring issue in TimeSeriesReferenceData.get
oruebel Jul 22, 2021
59bf4c1
Clarify docstring for TimeSeriesReferenceData.get
oruebel Jul 23, 2021
e080c09
Merge branch 'rc/2.0.0' into add/icephys_meta
oruebel Jul 23, 2021
361b752
Update CHANGELOG.md
rly Jul 27, 2021
97811fa
Update icephys.py
rly Jul 27, 2021
1040a56
Update docs/gallery/domain/plot_icephys.py
rly Jul 27, 2021
5786fae
Update plot_icephys.py
rly Jul 27, 2021
55f2ab3
Update plot_icephys_pandas.py
rly Jul 27, 2021
3e8ec23
Update icephys_testutils.py
rly Jul 27, 2021
6aac9dc
Update file.py
rly Jul 27, 2021
40a5937
Fix typos
rly Jul 27, 2021
2fd0d39
Update docs/gallery/domain/plot_icephys.py Replace master with main
oruebel Jul 27, 2021
cfbd10d
Mention use of create_icephys_testfile function
oruebel Jul 27, 2021
a45f092
Remove old comment in TimeSeriesReferenceVectorData
oruebel Jul 27, 2021
bdf989f
Fix typo in changelog
oruebel Jul 27, 2021
b8d87b2
Fix broken link target
oruebel Jul 27, 2021
cc46f4b
Fix flake8 in docs/gallery
oruebel Jul 27, 2021
fd0c521
Minor text fixes
rly Jul 28, 2021
564a449
Minor text and formatting fixes
rly Jul 28, 2021
dd74c0a
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
79271f9
Use namedtuple instead of numpy masked structured array to represent va…
oruebel Jul 28, 2021
f50b39f
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
00b4236
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
ad312b4
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
c797c99
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
2a0a4c8
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
17c8ef7
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
4c15d8b
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
924733d
Update tests/unit/test_icephys_metadata_tables.py
oruebel Jul 28, 2021
7d9f23e
Fix bad indent in test
oruebel Jul 29, 2021
c696b72
Update base.py - minor text edits
rly Jul 29, 2021
a58921a
Enhance introspection, slicing, and data addition for TimeSeriesRefer…
oruebel Jul 29, 2021
2dea1ee
Updated icephys query tutorial to use latest get_linked_tables behavi…
oruebel Jul 29, 2021
8061747
Fix flake8 in gallery
oruebel Jul 29, 2021
754c20c
Fix spelling in error message
oruebel Jul 29, 2021
e694247
Update src/pynwb/base.py
oruebel Jul 29, 2021
22bc9d6
Update src/pynwb/base.py
oruebel Jul 29, 2021
1c11785
Update src/pynwb/base.py
oruebel Jul 29, 2021
2065ab5
Update src/pynwb/base.py
oruebel Jul 29, 2021
36fb5ac
Update src/pynwb/base.py
oruebel Jul 29, 2021
c25d622
Update src/pynwb/base.py
oruebel Jul 29, 2021
9f3637d
Fix bug in TimeSeriesReference.timestamps and add comments
oruebel Jul 29, 2021
d64052f
Use hdmf 3.1.1
rly Jul 30, 2021
3d84c51
Merge branch 'add/icephys_meta' of https://github.com/NeurodataWithou…
rly Jul 30, 2021
fe4ce3d
Fix gallery tests to handle allensdk pinning pynwb/hdmf
rly Jul 30, 2021
42 changes: 31 additions & 11 deletions CHANGELOG.md
@@ -1,27 +1,47 @@
# PyNWB Changelog

## PyNWB 2.0.0 (TBD, 2021)
## PyNWB 2.0.0 (Upcoming)

### Breaking changes:
- ``SweepTable`` has been deprecated in favor of the new icephys metadata tables. Use of ``SweepTable``
is still possible but no longer recommended. @oruebel (#1349)

### New features:
- Drop Python 3.6 support, add Python 3.9 support. @rly (#1377)
- Update requirements to allow compatibility with HDMF 3 and h5py 3. @rly (#1377)
- Added new intracellular electrophysiology hierarchical table structure from ndx-icephys-meta to NWB core.
This includes the new types ``TimeSeriesReferenceVectorData``, ``IntracellularRecordingsTable``,
``SimultaneousRecordingsTable``, ``SequentialRecordingsTable``, ``RepetitionsTable`` and
``ExperimentalConditionsTable`` as well as corresponding updates to ``NWBFile`` to support interaction
with the new tables. @oruebel (#1349)
- Added support for nwb-schema 2.4.0. See [Release Notes](https://nwb-schema.readthedocs.io/en/latest/format_release_notes.html)
for more details. @oruebel (#1349)
- Dropped Python 3.6 support, added Python 3.9 support. @rly (#1377)
- Updated requirements to allow compatibility with HDMF 3 and h5py 3. @rly (#1377)
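
As an illustration of the new workflow, here is a minimal sketch based on the type and method names introduced by this PR (argument details may differ slightly from the final released API). Rows are added bottom-up through ``NWBFile``:

```python
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
from pynwb.icephys import VoltageClampStimulusSeries, VoltageClampSeries

nwbfile = NWBFile(session_description='icephys metadata tables demo',
                  identifier='icephys-demo-1',
                  session_start_time=datetime.now(tzlocal()))
device = nwbfile.create_device(name='Heka ITC-1600')
electrode = nwbfile.create_icephys_electrode(name='elec0',
                                             description='intracellular electrode',
                                             device=device)

# One stimulus/response pair; the table stores these references internally as
# TimeSeriesReferenceVectorData entries of (start_index, count, timeseries)
stimulus = VoltageClampStimulusSeries(name='stim0', data=[1., 2., 3., 4., 5.],
                                      starting_time=0.0, rate=10e3,
                                      electrode=electrode, gain=0.02)
response = VoltageClampSeries(name='resp0', data=[0.1, 0.2, 0.3, 0.4, 0.5],
                              conversion=1e-12, starting_time=0.0, rate=10e3,
                              electrode=electrode, gain=0.02)

# Populate the hierarchy bottom-up:
# recording -> simultaneous -> sequential -> repetition -> experimental condition
row = nwbfile.add_intracellular_recording(electrode=electrode,
                                          stimulus=stimulus,
                                          response=response)
sim = nwbfile.add_icephys_simultaneous_recording(recordings=[row])
seq = nwbfile.add_icephys_sequential_recording(simultaneous_recordings=[sim],
                                               stimulus_type='square')
rep = nwbfile.add_icephys_repetition(sequential_recordings=[seq])
nwbfile.add_icephys_experimental_condition(repetitions=[rep])
```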

### Tutorial enhancements:
- Added new tutorial for intracellular electrophysiology to describe the use of the new metadata tables
  and declared the previous tutorial using ``SweepTable`` as deprecated. @oruebel (#1349)
- Added new tutorial for querying intracellular electrophysiology metadata
(``docs/gallery/domain/plot_icephys_pandas.py``). @oruebel (#1349, #1383)
- Added thumbnails for tutorials to improve presentation of online docs. @oruebel (#1349)
- Used `sphinx.ext.extlinks` extension in docs to simplify linking to common targets. @oruebel (#1349)
- Created new section for advanced I/O tutorials and moved parallel I/O tutorial to its own file. @oruebel (#1349)
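
A short sketch of the kind of conversion the new pandas tutorial covers, assuming an ``nwbfile`` populated as in the sketch above. ``to_dataframe`` is standard ``DynamicTable`` behavior; the import path and table attribute used for the hierarchical join are assumptions based on the HDMF 3.1 helpers referenced by this PR:

```python
# Flat view of the intracellular recordings table
recordings_df = nwbfile.intracellular_recordings.to_dataframe()

# Hierarchical view joining all linked icephys metadata tables
# (assumed import path for the HDMF hierarchical-table helper)
from hdmf.common.hierarchicaltable import to_hierarchical_dataframe
all_meta_df = to_hierarchical_dataframe(nwbfile.icephys_experimental_conditions)
print(all_meta_df.head())
```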

### Minor new features:
- Add RRID for citing PyNWB to the docs. @oruebel (#1372)
- Update CI and tests to handle deprecations in libraries. @rly (#1377)
- Add test utilities for icephys (``pynwb.testing.icephys_testutils``) to ease creation of test data
for tests and tutorials. @oruebel (#1349, #1383)
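
A hedged usage sketch of the new test utility; the keyword arguments and return value shown here are assumptions inferred from the commit messages, not a confirmed signature:

```python
from pynwb import NWBHDF5IO
from pynwb.testing.icephys_testutils import create_icephys_testfile

test_path = 'icephys_demo_testfile.nwb'  # hypothetical file name
# Assumed keywords: write a small file pre-populated with the icephys metadata tables
nwbfile = create_icephys_testfile(filename=test_path,
                                  add_custom_columns=True,
                                  randomize_data=True)

with NWBHDF5IO(test_path, 'r') as io:
    infile = io.read()
    print(infile.intracellular_recordings.colnames)
```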

### Bug fixes:
- Enforce electrode ID uniqueness during insertion into table. @CodyCBakerPhD (#1344)
- Fix integration tests with invalid test data that will be caught by future hdmf validator version.
- Updated behavior of ``make clean`` command for docs to ensure tutorial files are cleaned up. @oruebel (#1349)
- Enforced electrode ID uniqueness during insertion into table. @CodyCBakerPhD (#1344)
- Fixed integration tests with invalid test data that will be caught by future hdmf validator version.
@dsleiter, @rly (#1366, #1376)
- Fix build warnings in docs @oruebel (#1380)
- Fixed build warnings in docs @oruebel (#1380)

## PyNWB 1.5.1 (May 24, 2021)

### Bug fix:
### Bug fixes:
- Raise minimum version of pandas from 0.23 to 1.0.5 to be compatible with numpy 1.20, and raise minimum version of
HDMF to use the corresponding change in HDMF. @rly (#1363)
- Update documentation and update structure of requirements files. @rly (#1363)
@@ -64,7 +84,7 @@
- Add capability to add a row to a column after IO.
- Add method `AbstractContainer.get_fields_conf`.
- Add functionality for storing external resource references.
- Add method `hdmf.utils.get_docval_macro` to get a tuple of the current values for a docval_macro, e.g., 'array_data'
and 'scalar_data'.
- `DynamicTable` can be automatically generated using `get_class`. Now the HDMF API can read files with extensions
that contain a DynamicTable without needing to import the extension first.
2 changes: 1 addition & 1 deletion docs/Makefile
@@ -45,7 +45,7 @@ help:
@echo " apidoc to build RST from source code"

clean:
-rm -rf $(BUILDDIR)/* $(RSTDIR)/$(PKGNAME)*.rst
-rm -rf $(BUILDDIR)/* $(RSTDIR)/$(PKGNAME)*.rst $(RSTDIR)/tutorials

html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
7 changes: 7 additions & 0 deletions docs/gallery/advanced_io/README.txt
@@ -0,0 +1,7 @@


.. _general-tutorials:


Advanced I/O
------------
@@ -1,6 +1,6 @@
'''
Advanced HDF5 I/O
=====================
Defining HDF5 Dataset I/O Settings (chunking, compression, etc.)
================================================================

The HDF5 storage backend supports a broad range of advanced dataset I/O options, such as,
chunking and compression. Here we demonstrate how to use these features
@@ -18,6 +18,7 @@
# Before we get started, lets create an NWBFile for testing so that we can add our data to it.
#

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_h5dataio.png'
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
@@ -216,72 +217,6 @@
# will be ignored as the h5py.Dataset will either be linked to or copied as on write.
#

####################
# Parallel I/O using MPI
# ----------------------
#
# The HDF5 storage backend supports parallel I/O using the Message Passing Interface (MPI).
# Using this feature requires that you install ``hdf5`` and ``h5py`` against an MPI driver, and you
# install ``mpi4py``. The basic installation of pynwb will not work. Setup can be tricky, and
# is outside the scope of this tutorial (for now), and the following assumes that you have
# HDF5 installed in a MPI configuration. Here we:
#
# 1. **Instantiate a dataset for parallel write**: We create TimeSeries with 4 timestamps that we
# will write in parallel
#
# 2. **Write to that file in parallel using MPI**: Here we assume 4 MPI ranks while each rank writes
# the data for a different timestamp.
#
# 3. **Read from the file in parallel using MPI**: Here each of the 4 MPI ranks reads one time
# step from the file
#
# .. code-block:: python
#
# from mpi4py import MPI
# import numpy as np
# from dateutil import tz
# from pynwb import NWBHDF5IO, NWBFile, TimeSeries
# from datetime import datetime
# from hdmf.data_utils import DataChunkIterator
#
# start_time = datetime(2018, 4, 25, 2, 30, 3, tzinfo=tz.gettz('US/Pacific'))
# fname = 'test_parallel_pynwb.nwb'
# rank = MPI.COMM_WORLD.rank # The process ID (integer 0-3 for 4-process run)
#
# # Create file on one rank. Here we only instantiate the dataset we want to
# # write in parallel but we do not write any data
# if rank == 0:
# nwbfile = NWBFile('aa', 'aa', start_time)
# data = DataChunkIterator(data=None, maxshape=(4,), dtype=np.dtype('int'))
#
# nwbfile.add_acquisition(TimeSeries('ts_name', description='desc', data=data,
# rate=100., unit='m'))
# with NWBHDF5IO(fname, 'w') as io:
# io.write(nwbfile)
#
# # write to dataset in parallel
# with NWBHDF5IO(fname, 'a', comm=MPI.COMM_WORLD) as io:
# nwbfile = io.read()
# print(rank)
# nwbfile.acquisition['ts_name'].data[rank] = rank
#
# # read from dataset in parallel
# with NWBHDF5IO(fname, 'r', comm=MPI.COMM_WORLD) as io:
# print(io.read().acquisition['ts_name'].data[rank])

####################
# To specify details about chunking, compression and other HDF5-specific I/O options,
# we can wrap data via ``H5DataIO``, e.g,
#
# .. code-block:: python
#
# data = H5DataIO(DataChunkIterator(data=None, maxshape=(100000, 100),
# dtype=np.dtype('float')),
# chunks=(10, 10), maxshape=(None, None))
#
# would initialize your dataset with a shape of (100000, 100) and maxshape of (None, None)
# and your own custom chunking of (10, 10).

####################
# Disclaimer
# ----------------
@@ -100,6 +100,7 @@
# simple helper function first to write a simple NWBFile containing a single timeseries to
# avoid repetition of the same code and to allow us to focus on the important parts of this tutorial.

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_iterative_write.png'
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile, TimeSeries
@@ -56,7 +56,7 @@
# In the following we are creating two :py:meth:`~pynwb.base.TimeSeries` each written to a separate file.
# We then show how we can integrate these files into a single NWBFile.


# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_linking_data.png'
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
81 changes: 81 additions & 0 deletions docs/gallery/advanced_io/parallelio.py
@@ -0,0 +1,81 @@
'''
Parallel I/O using MPI
======================
The HDF5 storage backend supports parallel I/O using the Message Passing Interface (MPI).
Using this feature requires that you install ``hdf5`` and ``h5py`` against an MPI driver, and you
install ``mpi4py``. The basic installation of pynwb will not work. Setup can be tricky, and
is outside the scope of this tutorial (for now), and the following assumes that you have
HDF5 installed in a MPI configuration.
'''

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_parallelio.png'

####################
# Here we:
#
# 1. **Instantiate a dataset for parallel write**: We create TimeSeries with 4 timestamps that we
# will write in parallel
#
# 2. **Write to that file in parallel using MPI**: Here we assume 4 MPI ranks while each rank writes
# the data for a different timestamp.
#
# 3. **Read from the file in parallel using MPI**: Here each of the 4 MPI ranks reads one time
# step from the file
#
# .. code-block:: python
#
# from mpi4py import MPI
# import numpy as np
# from dateutil import tz
# from pynwb import NWBHDF5IO, NWBFile, TimeSeries
# from datetime import datetime
# from hdmf.data_utils import DataChunkIterator
#
# start_time = datetime(2018, 4, 25, 2, 30, 3, tzinfo=tz.gettz('US/Pacific'))
# fname = 'test_parallel_pynwb.nwb'
# rank = MPI.COMM_WORLD.rank # The process ID (integer 0-3 for 4-process run)
#
# # Create file on one rank. Here we only instantiate the dataset we want to
# # write in parallel but we do not write any data
# if rank == 0:
# nwbfile = NWBFile('aa', 'aa', start_time)
# data = DataChunkIterator(data=None, maxshape=(4,), dtype=np.dtype('int'))
#
# nwbfile.add_acquisition(TimeSeries('ts_name', description='desc', data=data,
# rate=100., unit='m'))
# with NWBHDF5IO(fname, 'w') as io:
# io.write(nwbfile)
#
# # write to dataset in parallel
# with NWBHDF5IO(fname, 'a', comm=MPI.COMM_WORLD) as io:
# nwbfile = io.read()
# print(rank)
# nwbfile.acquisition['ts_name'].data[rank] = rank
#
# # read from dataset in parallel
# with NWBHDF5IO(fname, 'r', comm=MPI.COMM_WORLD) as io:
# print(io.read().acquisition['ts_name'].data[rank])

####################
# To specify details about chunking, compression and other HDF5-specific I/O options,
# we can wrap data via ``H5DataIO``, e.g,
#
# .. code-block:: python
#
# data = H5DataIO(DataChunkIterator(data=None, maxshape=(100000, 100),
# dtype=np.dtype('float')),
# chunks=(10, 10), maxshape=(None, None))
#
# would initialize your dataset with a shape of (100000, 100) and maxshape of (None, None)
# and your own custom chunking of (10, 10).

####################
# Disclaimer
# ----------------
#
# External links included in the tutorial are being provided as a convenience and for informational purposes only;
# they do not constitute an endorsement or an approval by the authors of any of the products, services or opinions of
# the corporation or organization or individual. The authors bear no responsibility for the accuracy, legality or
# content of the external site or for that of subsequent links. Contact the external site for answers to questions
# regarding its content.
3 changes: 3 additions & 0 deletions docs/gallery/domain/brain_observatory.py
@@ -10,10 +10,13 @@
# physiology submodule (pynwb.ophys). We will use the allensdk as a read API, while leveraging the pynwb data model and
# write api to transform and write the data back to disk.
#
# .. note: Using the latest allensdk package requires Python 3.6 or higher.

########################################
# .. raw:: html
# :url: https://gist.githubusercontent.com/nicain/82e6b3d8f9ff5b85ef01a582e41e2389/raw/

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_allenbrainobservatory.png'
from datetime import datetime
from dateutil.tz import tzlocal

8 changes: 3 additions & 5 deletions docs/gallery/domain/ecephys.py
@@ -5,20 +5,18 @@
Extracellular electrophysiology data
============================================
The following examples will reference variables that may not be defined within the block they are used in. For
clarity, we define them here:
'''


import numpy as np

#######################
# Creating and Writing NWB files
# ------------------------------
#
# When creating a NWB file, the first step is to create the :py:class:`~pynwb.file.NWBFile`. The first
# argument is the name of the NWB file, and the second argument is a brief description of the dataset.

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_ecephys.png'

import numpy as np
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
25 changes: 16 additions & 9 deletions docs/gallery/domain/icephys.py
@@ -2,11 +2,17 @@
'''
.. _icephys_tutorial:
Intracellular electrophysiology data
============================================
Intracellular electrophysiology data using SweepTable
=====================================================
The following examples will reference variables that may not be defined within the block they are used in. For
clarity, we define them here:
The following tutorial describes storage of intracellular electrophysiology data in NWB using the
SweepTable to manage recordings.
.. warning::
The use of SweepTable has been deprecated as of PyNWB >v2.0 in favor of the new hierarchical
intracellular electrophysiology metadata tables to allow for a more complete description of
intracellular electrophysiology experiments. See the :doc:`Intracellular electrophysiology <plot_icephys>`
tutorial for details.
'''

#######################
@@ -16,6 +22,7 @@
# When creating a NWB file, the first step is to create the :py:class:`~pynwb.file.NWBFile`. The first
# argument is a brief description of the dataset.

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_icephys_sweeptable.png'
from datetime import datetime
from dateutil.tz import tzlocal
from pynwb import NWBFile
@@ -76,7 +83,7 @@
ccss = CurrentClampStimulusSeries(
name="ccss", data=[1, 2, 3, 4, 5], starting_time=123.6, rate=10e3, electrode=elec, gain=0.02, sweep_number=0)

nwbfile.add_stimulus(ccss)
nwbfile.add_stimulus(ccss, use_sweep_table=True)
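
``SweepTable`` is deprecated in this release, so the first call that creates it raises a ``DeprecationWarning`` (see the commits above that move the warning into the ``SweepTable`` constructor). For legacy scripts or tests that still need this path, a standard-library sketch for silencing the warning:

```python
import warnings

with warnings.catch_warnings():
    # Explicitly acknowledge the SweepTable deprecation while keeping the legacy behavior
    warnings.simplefilter('ignore', DeprecationWarning)
    nwbfile.add_stimulus(ccss, use_sweep_table=True)
```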

#######################
# We now add another stimulus series but from a different sweep. TimeSeries
@@ -87,7 +94,7 @@
vcss = VoltageClampStimulusSeries(
name="vcss", data=[2, 3, 4, 5, 6], starting_time=234.5, rate=10e3, electrode=elec, gain=0.03, sweep_number=1)

nwbfile.add_stimulus(vcss)
nwbfile.add_stimulus(vcss, use_sweep_table=True)

#######################
# Here, we will use :py:class:`~pynwb.icephys.CurrentClampSeries` to store current clamp
@@ -102,7 +109,7 @@
electrode=elec, gain=0.02, bias_current=1e-12, bridge_balance=70e6,
capacitance_compensation=1e-12, sweep_number=0)

nwbfile.add_acquisition(ccs)
nwbfile.add_acquisition(ccs, use_sweep_table=True)

#######################
# And voltage clamp data from the second sweep using
@@ -116,7 +123,7 @@
electrode=elec, gain=0.02, capacitance_slow=100e-12, resistance_comp_correction=70.0,
sweep_number=1)

nwbfile.add_acquisition(vcs)
nwbfile.add_acquisition(vcs, use_sweep_table=True)

####################
# .. _icephys_writing:
@@ -183,7 +190,7 @@
# PatchClampSeries which belongs to a certain sweep number via
# :py:meth:`~pynwb.icephys.SweepTable.get_series`.
#
# The following call will return the voltage clamp data, two timeseries
# The following call will return the voltage clamp data of two timeseries
# consisting of acquisition and stimulus, from sweep 1.

series = nwbfile.sweep_table.get_series(1)
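
If the stimulus and response need to be handled separately, the returned list can be split by type, for example (a sketch using the two series created for sweep 1 above):

```python
from pynwb.icephys import VoltageClampSeries, VoltageClampStimulusSeries

stimuli = [s for s in series if isinstance(s, VoltageClampStimulusSeries)]
responses = [s for s in series if isinstance(s, VoltageClampSeries)]
```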
1 change: 1 addition & 0 deletions docs/gallery/domain/ophys.py
@@ -17,6 +17,7 @@
clarity, we define them here:
'''

# sphinx_gallery_thumbnail_path = 'figures/gallery_thumbnails_ophys.png'
from datetime import datetime
from dateutil.tz import tzlocal
