caution
Borda authored Oct 31, 2024
1 parent ea37534 commit 5f29c4d
Showing 1 changed file with 4 additions and 3 deletions.
7 changes: 4 additions & 3 deletions docs/source/pages/overview.rst
@@ -492,9 +492,10 @@ In practice this means that:
 A functional metric is differentiable if its corresponding modular metric is differentiable.

-For PyTorch versions 2.1 or higher, differentiation in DDP mode is enabled, allowing autograd graph
-propagation after the ``all_gather`` operation. This is useful for synchronizing metrics used as
-loss functions in a DDP setting.
+.. caution::
+    For PyTorch versions 2.1 or higher, differentiation in DDP mode is enabled, allowing autograd graph
+    propagation after the ``all_gather`` operation. This is useful for synchronizing metrics used as
+    loss functions in a DDP setting.

 ***************************************
 Metrics and hyperparameter optimization
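The added caution is worth a concrete illustration. Below is a minimal sketch, not part of this commit, of a metric used as a differentiable loss under DDP: ``MeanSquaredError`` and the ``dist_sync_on_step`` flag are real torchmetrics APIs, while the two-process ``gloo`` setup, model, and data are illustrative assumptions. With PyTorch 2.1 or higher, the backward pass should propagate through the ``all_gather`` that syncs the metric state::

    import os

    import torch
    import torch.distributed as dist
    import torch.multiprocessing as mp
    from torch.nn.parallel import DistributedDataParallel as DDP
    from torchmetrics.regression import MeanSquaredError


    def worker(rank: int, world_size: int) -> None:
        # Illustrative single-node process-group setup (assumed, not from the commit).
        os.environ["MASTER_ADDR"] = "127.0.0.1"
        os.environ["MASTER_PORT"] = "29500"
        dist.init_process_group("gloo", rank=rank, world_size=world_size)

        model = DDP(torch.nn.Linear(4, 1))
        optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
        # dist_sync_on_step=True makes forward() gather states across ranks;
        # per the caution, on PyTorch >= 2.1 the autograd graph survives that gather.
        metric = MeanSquaredError(dist_sync_on_step=True)

        preds = model(torch.randn(8, 4))
        target = torch.randn(8, 1)

        loss = metric(preds, target)  # the metric value doubles as the loss
        loss.backward()               # gradients flow back through the all_gather
        optimizer.step()

        dist.destroy_process_group()


    if __name__ == "__main__":
        mp.spawn(worker, args=(2,), nprocs=2)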
