From 5f29c4d1164bf4fa21ab4a88ba4f327baf4d72ef Mon Sep 17 00:00:00 2001
From: Jirka Borovec <6035284+Borda@users.noreply.github.com>
Date: Thu, 31 Oct 2024 17:29:46 +0100
Subject: [PATCH] caution

---
 docs/source/pages/overview.rst | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/docs/source/pages/overview.rst b/docs/source/pages/overview.rst
index a5d10377c92..34d0dcbd6fc 100644
--- a/docs/source/pages/overview.rst
+++ b/docs/source/pages/overview.rst
@@ -492,9 +492,10 @@ In practice this means that:
 
 A functional metric is differentiable if its corresponding modular metric is differentiable.
 
-For PyTorch versions 2.1 or higher, differentiation in DDP mode is enabled, allowing autograd graph
-propagation after the ``all_gather`` operation. This is useful for synchronizing metrics used as
-loss functions in a DDP setting.
+.. caution::
+    For PyTorch versions 2.1 or higher, differentiation in DDP mode is enabled, allowing autograd graph
+    propagation after the ``all_gather`` operation. This is useful for synchronizing metrics used as
+    loss functions in a DDP setting.
 
 ***************************************
 Metrics and hyperparameter optimization
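
A minimal sketch of the behaviour the caution describes, assuming ``MeanSquaredError``
as the differentiable metric (single-process shown; in a DDP run with PyTorch 2.1 or
higher the same backward pass would also traverse the ``all_gather`` synchronization)::

    import torch
    from torchmetrics import MeanSquaredError

    # Differentiable modular metrics keep the autograd graph intact through
    # ``forward``, so the batch value they return can be used as a loss.
    metric = MeanSquaredError()
    preds = torch.randn(10, requires_grad=True)
    target = torch.randn(10)

    loss = metric(preds, target)  # metric value computed on this batch
    loss.backward()               # gradients flow back into ``preds``
    assert preds.grad is not None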