docs: pin version in links to external docs (#2435)
* docs: pin version in links to external docs
* lightning-utilities >=0.11.0, <0.12.0
Borda authored Mar 19, 2024
1 parent 20b8d60 commit 3c5ceeb
Showing 10 changed files with 35 additions and 13 deletions.
4 changes: 3 additions & 1 deletion .github/workflows/docs-build.yml
@@ -69,7 +69,9 @@ jobs:
       - name: Full build for deployment
         if: github.event_name != 'pull_request'
-        run: echo "SPHINX_FETCH_ASSETS=1" >> $GITHUB_ENV
+        run: |
+          echo "SPHINX_FETCH_ASSETS=1" >> $GITHUB_ENV
+          echo "SPHINX_PIN_RELEASE_VERSIONS=1" >> $GITHUB_ENV
       - name: make ${{ matrix.target }}
         working-directory: ./docs
         run: make ${{ matrix.target }} --debug --jobs $(nproc) SPHINXOPTS="-W --keep-going"
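The workflow exports these toggles to Sphinx as environment variables, and `conf.py` (below) coerces them to integers with `int(os.environ.get(..., False))`. A minimal sketch of how that coercion behaves (the `DEMO_UNSET_FLAG` name is purely illustrative):

```python
import os

# Simulate the CI gate: the workflow exports the flag as the string "1".
os.environ["SPHINX_PIN_RELEASE_VERSIONS"] = "1"

# int("1") -> 1 when the flag was exported;
# int(False) -> 0 when the variable is unset and the bool default kicks in.
pin_versions = int(os.environ.get("SPHINX_PIN_RELEASE_VERSIONS", False))
unset_flag = int(os.environ.get("DEMO_UNSET_FLAG", False))  # hypothetical, never set

print(pin_versions, unset_flag)
```

Note one sharp edge of this pattern: an empty-string value (`FLAG=""`) would make `int("")` raise `ValueError`, so the workflow must export `"1"`/`"0"` or leave the variable unset.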
2 changes: 1 addition & 1 deletion README.md
@@ -97,7 +97,7 @@ TorchMetrics is a collection of 100+ PyTorch metrics implementations and an easy
 - Metrics optimized for distributed-training
 - Automatic synchronization between multiple devices

-You can use TorchMetrics with any PyTorch model or with [PyTorch Lightning](https://pytorch-lightning.readthedocs.io/en/stable/) to enjoy additional features such as:
+You can use TorchMetrics with any PyTorch model or with [PyTorch Lightning](https://lightning.ai/docs/pytorch/stable/) to enjoy additional features such as:

 - Module metrics are automatically placed on the correct device.
 - Native support for logging metrics in Lightning to reduce even more boilerplate.
24 changes: 22 additions & 2 deletions docs/source/conf.py
@@ -20,7 +20,7 @@

 import lai_sphinx_theme
 import torchmetrics
-from lightning_utilities.docs import fetch_external_assets
+from lightning_utilities.docs import adjust_linked_external_docs, fetch_external_assets
 from lightning_utilities.docs.formatting import _transform_changelog

 _PATH_HERE = os.path.abspath(os.path.dirname(__file__))
@@ -30,6 +30,7 @@
 FOLDER_GENERATED = "generated"
 SPHINX_MOCK_REQUIREMENTS = int(os.environ.get("SPHINX_MOCK_REQUIREMENTS", True))
 SPHINX_FETCH_ASSETS = int(os.environ.get("SPHINX_FETCH_ASSETS", False))
+SPHINX_PIN_RELEASE_VERSIONS = int(os.getenv("SPHINX_PIN_RELEASE_VERSIONS", False))

 html_favicon = "_static/images/icon.svg"

@@ -86,6 +87,25 @@ def _set_root_image_path(page_path: str) -> None:
 for page in all_pages:
     _set_root_image_path(page)

+
+if SPHINX_PIN_RELEASE_VERSIONS:
+    adjust_linked_external_docs(
+        "https://numpy.org/doc/stable/", "https://numpy.org/doc/{numpy.__version__}/", _PATH_ROOT
+    )
+    adjust_linked_external_docs(
+        "https://pytorch.org/docs/stable/", "https://pytorch.org/docs/{torch.__version__}/", _PATH_ROOT
+    )
+    adjust_linked_external_docs(
+        "https://matplotlib.org/stable/",
+        "https://matplotlib.org/{matplotlib.__version__}/",
+        _PATH_ROOT,
+        version_digits=3,
+    )
+    adjust_linked_external_docs(
+        "https://scikit-learn.org/stable/", "https://scikit-learn.org/{sklearn.__version__}/", _PATH_ROOT
+    )
+
+
 # -- General configuration ---------------------------------------------------

 # If your documentation needs a minimal Sphinx version, state it here.
@@ -263,7 +283,7 @@ def _set_root_image_path(page_path: str) -> None:
     "python": ("https://docs.python.org/3", None),
     "torch": ("https://pytorch.org/docs/stable/", None),
     "numpy": ("https://numpy.org/doc/stable/", None),
-    "matplotlib": ("http://matplotlib.org/stable", None),
+    "matplotlib": ("https://matplotlib.org/stable/", None),
 }
 nitpicky = True
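The `adjust_linked_external_docs` helper lives in `lightning_utilities.docs`; this commit uses it to rewrite `.../stable/` links in the source tree to the installed package's version (truncated to `version_digits` components, e.g. 3 for matplotlib). A minimal self-contained sketch of that idea, not the real implementation:

```python
def pin_stable_links(text: str, stable_url: str, version: str, version_digits: int = 2) -> str:
    """Replace occurrences of a 'stable' docs URL with a version-pinned one.

    Sketch of the idea behind lightning_utilities' adjust_linked_external_docs:
    substitute the installed package version, truncated to `version_digits`
    dot-separated parts, for the 'stable' alias in the URL.
    """
    short = ".".join(version.split(".")[:version_digits])
    pinned = stable_url.replace("stable", short)
    return text.replace(stable_url, pinned)


doc = "See https://numpy.org/doc/stable/reference/ for details."
# With version "1.26.4" and the default 2 digits, 'stable' becomes '1.26'.
print(pin_stable_links(doc, "https://numpy.org/doc/stable/", "1.26.4"))
```

The real helper walks `_PATH_ROOT` and resolves `{numpy.__version__}`-style placeholders from the installed packages; the sketch only shows the string rewrite at its core.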
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -19,7 +19,7 @@ TorchMetrics is a collection of 100+ PyTorch metrics implementations and an easy
 * Automatic accumulation over batches
 * Automatic synchronization between multiple devices

-You can use TorchMetrics in any PyTorch model, or within `PyTorch Lightning <https://pytorch-lightning.readthedocs.io/en/stable/>`_ to enjoy the following additional benefits:
+You can use TorchMetrics in any PyTorch model, or within `PyTorch Lightning <https://lightning.ai/docs/pytorch/stable/>`_ to enjoy the following additional benefits:

 * Your data will always be placed on the same device as your metrics
 * You can log :class:`~torchmetrics.Metric` objects directly in Lightning to reduce even more boilerplate
6 changes: 3 additions & 3 deletions docs/source/pages/lightning.rst
@@ -22,11 +22,11 @@ While TorchMetrics was built to be used with native PyTorch, using TorchMetrics
 * Modular metrics are automatically placed on the correct device when properly defined inside a LightningModule.
   This means that your data will always be placed on the same device as your metrics. No need to call ``.to(device)`` anymore!
 * Native support for logging metrics in Lightning using
-  `self.log <https://pytorch-lightning.readthedocs.io/en/stable/extensions/logging.html#logging-from-a-lightningmodule>`_ inside
+  `self.log <https://lightning.ai/docs/pytorch/stable/extensions/logging.html#logging-from-a-lightningmodule>`_ inside
   your LightningModule.
 * The ``.reset()`` method of the metric will automatically be called at the end of an epoch.

-The example below shows how to use a metric in your `LightningModule <https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html>`_:
+The example below shows how to use a metric in your `LightningModule <https://lightning.ai/docs/pytorch/stable/common/lightning_module.html>`_:

 .. testcode:: python

@@ -64,7 +64,7 @@ Logging TorchMetrics

 Logging metrics can be done in two ways: either logging the metric object directly or the computed metric values.
 When :class:`~torchmetrics.Metric` objects, which return a scalar tensor are logged directly in Lightning using the
-LightningModule `self.log <https://pytorch-lightning.readthedocs.io/en/stable/extensions/logging.html#logging-from-a-lightningmodule>`_
+LightningModule `self.log <https://lightning.ai/docs/pytorch/stable/extensions/logging.html#logging-from-a-lightningmodule>`_
 method, Lightning will log the metric based on ``on_step`` and ``on_epoch`` flags present in ``self.log(...)``. If
 ``on_epoch`` is True, the logger automatically logs the end of epoch metric value by calling ``.compute()``.
2 changes: 1 addition & 1 deletion docs/source/pages/overview.rst
@@ -96,7 +96,7 @@ be moved to the same device as the input of the metric:
     print(out.device)  # cuda:0

 However, when **properly defined** inside a :class:`~torch.nn.Module` or
-`LightningModule <https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html>`_ the metric will
+`LightningModule <https://lightning.ai/docs/pytorch/stable/common/lightning_module.html>`_ the metric will
 be automatically moved to the same device as the module when using ``.to(device)``. Being
 **properly defined** means that the metric is correctly identified as a child module of the
 model (check ``.children()`` attribute of the model). Therefore, metrics cannot be placed
2 changes: 1 addition & 1 deletion docs/source/pages/quickstart.rst
@@ -11,7 +11,7 @@ TorchMetrics is a collection of 100+ PyTorch metrics implementations and an easy
 * Automatic accumulation over batches
 * Automatic synchronization between multiple devices

-You can use TorchMetrics in any PyTorch model, or within `PyTorch Lightning <https://pytorch-lightning.readthedocs.io/en/stable/>`_ to enjoy additional features:
+You can use TorchMetrics in any PyTorch model, or within `PyTorch Lightning <https://lightning.ai/docs/pytorch/stable/>`_ to enjoy additional features:

 * This means that your data will always be placed on the same device as your metrics.
 * Native support for logging metrics in Lightning to reduce even more boilerplate.
2 changes: 1 addition & 1 deletion requirements/_docs.txt
@@ -12,7 +12,7 @@ sphinx-togglebutton ==0.3.2
 sphinx-copybutton ==0.5.2

 lightning >=1.8.0, <2.3.0
-lightning-utilities >=0.9.0, <0.11.0
+lightning-utilities >=0.11.0, <0.12.0
 pydantic > 1.0.0, < 3.0.0

 # integrations
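The requirement bumps above use half-open ranges such as `>=0.11.0, <0.12.0`, which admit any patch release of the 0.11 series while excluding 0.12. A toy sketch of how such a range is evaluated (real resolvers like pip follow PEP 440, including pre-release handling this sketch ignores):

```python
def parse(version: str) -> tuple:
    """Split a plain X.Y.Z version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))


def satisfies(version: str, lower: str, upper: str) -> bool:
    """True when lower <= version < upper, mirroring '>=lower, <upper'."""
    return parse(lower) <= parse(version) < parse(upper)


print(satisfies("0.11.3", "0.11.0", "0.12.0"))  # any 0.11.x patch is allowed
print(satisfies("0.10.1", "0.11.0", "0.12.0"))  # too old, excluded
```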
2 changes: 1 addition & 1 deletion requirements/base.txt
@@ -5,4 +5,4 @@ numpy >1.20.0
 packaging >17.1
 torch >=1.10.0, <=2.2.1
 typing-extensions; python_version < '3.9'
-lightning-utilities >=0.8.0, <0.11.0
+lightning-utilities >=0.8.0, <0.12.0
2 changes: 1 addition & 1 deletion src/torchmetrics/__about__.py
@@ -8,7 +8,7 @@
 __docs_url__ = "https://lightning.ai/docs/torchmetrics/stable/"
 __long_doc__ = """
 Torchmetrics is a metrics API created for easy metric development and usage in both PyTorch and
-[PyTorch Lightning](https://pytorch-lightning.readthedocs.io/en/stable/). It was originally a part of
+[PyTorch Lightning](https://lightning.ai/docs/pytorch/stable/). It was originally a part of
 Pytorch Lightning, but got split off so users could take advantage of the large collection of metrics
 implemented without having to install Pytorch Lightning (even though we would love for you to try it out).
 We currently have around 100+ metrics implemented and we continuously are adding more metrics, both within
