Fix build warnings in docs #1380

Merged: 6 commits, Jul 8, 2021
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -9,7 +9,7 @@
- Enforce electrode ID uniqueness during insertion into table. @CodyCBakerPhD (#1344)
- Fix integration tests with invalid test data that will be caught by future hdmf validator version.
@dsleiter, @rly (#1366, #1376)

- Fix build warnings in docs @oruebel (#1380)

## PyNWB 1.5.1 (May 24, 2021)

2 changes: 1 addition & 1 deletion docs/gallery/domain/ophys.py
@@ -119,7 +119,7 @@
# Storing fluorescence measurements
# ---------------------------------
#
# Now that ROIs are stored, you can store fluorescence (or dF/F [#]_) data for these regions of interest.
# Now that ROIs are stored, you can store fluorescence (or dF/F) data for these regions of interest.
# This type of data is stored using the :py:class:`~pynwb.ophys.RoiResponseSeries` class. You will not need
# to instantiate this class directly to create objects of this type, but it is worth noting that this is the
# class you will work with after you read data back in.
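#
# As an illustration (an editor's hedged sketch, not part of this PR's diff), a
# :py:class:`~pynwb.ophys.RoiResponseSeries` could be created and wrapped in a
# :py:class:`~pynwb.ophys.Fluorescence` container roughly like this. It assumes
# ``rt_region`` is a ``DynamicTableRegion`` over the ROI table created earlier in
# this tutorial, and the data values are placeholders.

import numpy as np

from pynwb.ophys import Fluorescence, RoiResponseSeries

roi_resp_series = RoiResponseSeries(
    name="RoiResponseSeries",
    data=np.ones((50, 2)),  # 50 time points for 2 ROIs (placeholder values)
    rois=rt_region,  # assumed: DynamicTableRegion over the PlaneSegmentation table
    unit="lumens",
    rate=30.0,
)

# Fluorescence (or DfOverF for dF/F data) groups one or more RoiResponseSeries
fluorescence = Fluorescence(roi_response_series=roi_resp_series)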
@@ -11,7 +11,7 @@ generates a repository with the appropriate directory structure.
After you finish the instructions `here <https://github.com/nwb-extensions/ndx-template#getting-started>`_,
you should have a directory structure that looks like this

.. code-block:: c
.. code-block:: bash

├── LICENSE.txt
├── MANIFEST.in
@@ -51,8 +51,8 @@ you should have a directory structure that looks like this
└── create_extension_spec.py

At its core, an NWB extension consists of YAML text files, such as those generated in the `spec`
folder. While you can write these YAML extension files by hand, PyNWB provides a convenient API
via the :py:mod:`~pynwb.spec` module for creating extensions.

Open ``src/spec/create_extension_spec.py``. You will be
modifying this script to create your own NWB extension. Let's first walk through each piece.
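
As a quick orientation (an editor's hedged sketch, not text from this PR), the core of a ``create_extension_spec.py`` script generated by the template typically boils down to something like the following; the namespace name, author details, and type names are placeholders.

.. code-block:: python

    from pynwb.spec import NWBGroupSpec, NWBNamespaceBuilder, export_spec

    # Describe the new namespace (placeholder metadata)
    ns_builder = NWBNamespaceBuilder(
        doc="Example extension (placeholder)",
        name="ndx-my-extension",
        version="0.1.0",
        author="Jane Doe",
        contact="jane.doe@example.com",
    )

    # Pull in the core type that the new type will build on
    ns_builder.include_type("NWBDataInterface", namespace="core")

    # Define a new group type (the following sections explain these keys)
    my_container = NWBGroupSpec(
        neurodata_type_def="MyContainer",
        neurodata_type_inc="NWBDataInterface",
        doc="A placeholder container type.",
    )

    # Write the namespace and extension YAML files into the spec/ folder
    export_spec(ns_builder, [my_container], "spec")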
33 changes: 18 additions & 15 deletions docs/source/extensions_tutorial/3_spec_api.rst
@@ -34,8 +34,8 @@ within it Datasets, Attributes, Links, and/or other Groups. Groups are specified

- ``neurodata_type_def`` declares the name of the neurodata type.
- ``neurodata_type_inc`` indicates what data type you are extending (Groups must extend Groups, and Datasets must extend Datasets).
- To define a new neurodata type that does not extend an existing type, use
``neurodata_type_inc=NWBContainer`` for a group or ``neurodata_type_inc=NWBData`` for a dataset.
``NWBContainer`` and ``NWBData`` are base types for NWB.
- To use a type that has already been defined, use ``neurodata_type_inc`` and not ``neurodata_type_def``.
- You can define a group that is not a neurodata type by omitting both ``neurodata_type_def`` and ``neurodata_type_inc``. A short sketch illustrating these keys follows this list.
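
To make the keys above concrete, here is an editor's hedged sketch (the type and group names are placeholders, not part of this PR):

.. code-block:: python

    from pynwb.spec import NWBGroupSpec

    # A new neurodata type that extends the core NWBDataInterface base type
    example_group = NWBGroupSpec(
        neurodata_type_def="ExampleContainer",  # placeholder type name
        neurodata_type_inc="NWBDataInterface",
        doc="An example group type (placeholder).",
    )

    # A group that is not a neurodata type: no def/inc, matched by its name
    example_group.add_group(name="internal_group", doc="An untyped subgroup.")

    # Reusing an already-defined type: only neurodata_type_inc is given
    example_group.add_group(
        neurodata_type_inc="DynamicTable",
        doc="Zero or more DynamicTables.",
        quantity="*",
    )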
@@ -79,17 +79,20 @@ All larger blocks of numeric or text data should be stored in Datasets. Specifyi
``neurodata_type_def``, ``neurodata_type_inc``, ``doc``, ``name``, ``default_name``, ``linkable``, ``quantity``, and
``attributes`` all work the same as they do in :py:class:`~pynwb.spec.NWBGroupSpec`, described in the previous section.

``dtype`` defines the type of the data, which can be a basic type, compound type, or reference type.
See a list of options `here <https://schema-language.readthedocs.io/en/latest/description.html#dtype>`_.
Basic types can be defined as string objects and more complex types via :py:class:`~pynwb.spec.NWBDtypeSpec` and `RefSpec <https://hdmf.readthedocs.io/en/latest/hdmf.spec.spec.html#hdmf.spec.spec.RefSpec>`_.
``dtype`` defines the type of the data, which can be a basic type, compound type, or reference type.
See a list of `dtype options <https://schema-language.readthedocs.io/en/latest/description.html#dtype>`_
as part of the specification language docs. Basic types can be defined as string objects and more complex
types via :py:class:`~pynwb.spec.NWBDtypeSpec` and
`RefSpec <https://hdmf.readthedocs.io/en/latest/hdmf.spec.spec.html#hdmf.spec.spec.RefSpec>`_.
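
For instance (an editor's hedged sketch, not part of this diff), a compound dtype can be assembled from :py:class:`~pynwb.spec.NWBDtypeSpec` entries:

.. code-block:: python

    from pynwb.spec import NWBDatasetSpec, NWBDtypeSpec

    # A dataset whose rows are (start, stop) pairs, using a compound dtype
    intervals_spec = NWBDatasetSpec(
        name="intervals",  # placeholder dataset name
        doc="Start and stop times (placeholder example).",
        dtype=[
            NWBDtypeSpec(name="start", doc="Start time in seconds.", dtype="float64"),
            NWBDtypeSpec(name="stop", doc="Stop time in seconds.", dtype="float64"),
        ],
    )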


``shape`` is a specification defining the allowable shapes for the dataset. See the shape specification
`here <https://schema-language.readthedocs.io/en/latest/specification_language_description.html#shape>`_. ``None`` is
mapped to ``null``. If no shape is provided, it is assumed that the dataset is only a single element.
``shape`` is a specification defining the allowable shapes for the dataset. See the
`shape specification <https://schema-language.readthedocs.io/en/latest/specification_language_description.html#shape>`_
as part of the specification language docs. ``None`` is mapped to ``null``. If no shape is provided, it is
assumed that the dataset is only a single element.

If the dataset is a single element (scalar) that represents meta-data, consider using an Attribute (see
below) to store the data more efficiently instead. However, note that a Dataset can have Attributes,
whereas an Attribute cannot have Attributes of its own.
``dims`` provides labels for each dimension of ``shape``.
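
As an illustration of ``shape`` and ``dims`` (an editor's hedged sketch; the type name is a placeholder):

.. code-block:: python

    from pynwb.spec import NWBDatasetSpec

    # Data may be 1-D (num_times) or 2-D (num_times x num_channels);
    # None for a dimension means that dimension can have any length.
    data_spec = NWBDatasetSpec(
        neurodata_type_def="ExampleData",  # placeholder type name
        neurodata_type_inc="NWBData",
        doc="Example data with two allowed shapes (placeholder).",
        dtype="float32",
        shape=[(None,), (None, None)],
        dims=[("num_times",), ("num_times", "num_channels")],
    )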

@@ -139,16 +142,16 @@ defined in the ``attributes`` field of a :py:class:`~pynwb.spec.NWBGroupSpec` or
neurodata type, i.e., the ``neurodata_type_def`` and ``neurodata_type_inc`` keys are not allowed. The only way to match an object with a spec is through the name of the attribute, so ``name`` is
required. You cannot have multiple attributes on a single group/dataset that correspond to the same
:py:class:`~pynwb.spec.NWBAttributeSpec`, since these would have to have the same name. Therefore, instead of
specifying a ``quantity``, you have a ``required`` field, which takes a boolean value. Another
key difference between datasets and attributes is that attributes cannot have attributes of their own.
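
For example (an editor's hedged sketch; the names are placeholders, not part of this PR), an attribute spec is identified by its ``name`` and marked optional with ``required``:

.. code-block:: python

    from pynwb.spec import NWBAttributeSpec, NWBGroupSpec

    # An optional scalar text attribute; `name` is mandatory and `required`
    # (a boolean) takes the place of `quantity`.
    method_attr = NWBAttributeSpec(
        name="method",  # placeholder attribute name
        doc="Description of the method used (placeholder).",
        dtype="text",
        required=False,
    )

    annotated_group = NWBGroupSpec(
        neurodata_type_def="AnnotatedContainer",  # placeholder type name
        neurodata_type_inc="NWBDataInterface",
        doc="A placeholder container carrying one attribute.",
        attributes=[method_attr],
    )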

.. tip::
Dataset or Attribute? It is often possible to store data as either a Dataset or an Attribute. Our best advice is
to keep Attributes small. In HDF5, the typical size limit for attributes is 64 KB. If an attribute is going to
store more than 64 KB, then make it a Dataset. Attributes are also more efficient for storing very
small data, such as scalars. However, attributes cannot have attributes of their own, and in HDF5,
I/O filters, such as compression and chunking, cannot be applied to attributes.


Link Specifications
^^^^^^^^^^^^^^^^^^^
2 changes: 2 additions & 0 deletions docs/source/extensions_tutorial/extensions_tutorial_home.rst
@@ -1,3 +1,5 @@
.. _extending-nwb:

Extending NWB
=============

36 changes: 0 additions & 36 deletions docs/source/tutorial_source/convert.rst

This file was deleted.