Hackathon No.21 #4723

Merged
merged 11 commits on Jul 27, 2022
4 changes: 4 additions & 0 deletions docs/api/paddle/nn/Overview_cn.rst
@@ -258,10 +258,12 @@ Loss layers
" :ref:`paddle.nn.MSELoss <cn_api_paddle_nn_MSELoss>` ", "均方差误差损失层"
" :ref:`paddle.nn.NLLLoss <cn_api_nn_loss_NLLLoss>` ", "NLLLoss层"
" :ref:`paddle.nn.SmoothL1Loss <cn_api_paddle_nn_SmoothL1Loss>` ", "平滑L1损失层"
" :ref:`paddle.nn.SoftMarginLoss <cn_api_paddle_nn_SoftMarginLoss>` ", "SoftMarginLoss层"
" :ref:`paddle.nn.TripletMarginLoss <cn_api_paddle_nn_TripletMarginLoss>` ", "TripletMarginLoss层"
" :ref:`paddle.nn.TripletMarginWithDistanceLoss <cn_api_paddle_nn_TripletMarginWithDistanceLoss>` ", "TripletMarginWithDistanceLoss层"
" :ref:`paddle.nn.MultiLabelSoftMarginLoss <cn_api_paddle_nn_MultiLabelSoftMarginLoss>` ", "多标签Hinge损失层"


.. _vision_layers:

Vision layers
@@ -482,9 +484,11 @@ Embedding-related functions
" :ref:`paddle.nn.functional.smooth_l1_loss <cn_paddle_nn_functional_loss_smooth_l1>` ", "用于计算平滑L1损失"
" :ref:`paddle.nn.functional.softmax_with_cross_entropy <cn_api_fluid_layers_softmax_with_cross_entropy>` ", "将softmax操作、交叉熵损失函数的计算过程进行合并"
" :ref:`paddle.nn.functional.margin_cross_entropy <cn_api_paddle_nn_functional_margin_cross_entropy>` ", "支持 ``Arcface``,``Cosface``,``Sphereface`` 的结合 Margin 损失函数"
" :ref:`paddle.nn.functional.soft_margin_loss <cn_api_paddle_nn_functional_soft_margin_loss>` ", "用于计算soft margin loss损失函数"
" :ref:`paddle.nn.functional.triplet_margin_loss <cn_api_paddle_nn_functional_triplet_margin_loss>` ", "用于计算TripletMarginLoss"
" :ref:`paddle.nn.functional.triplet_margin_with_distance_loss <cn_api_paddle_nn_functional_triplet_margin_with_distance_loss>` ", "用户自定义距离函数用于计算triplet margin loss 损失"
" :ref:`paddle.nn.functional.multi_label_soft_margin_loss <cn_api_nn_functional_multi_label_soft_margin_loss>` ", "用于计算多分类的hinge loss损失函数"


.. _common_functional:

38 changes: 38 additions & 0 deletions docs/api/paddle/nn/SoftMarginLoss_cn.rst
@@ -0,0 +1,38 @@
.. _cn_api_paddle_nn_SoftMarginLoss:

SoftMarginLoss
-------------------------------

.. py:class:: paddle.nn.SoftMarginLoss(reduction='mean', name=None)

Creates a callable object that computes the binary classification loss between the input `input` and the `label`.


The loss is computed according to the following formula:

.. math::
\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{\text{x.nelement}()}


Finally, the `reduction` operation is applied to the intermediate output `Out`. When `reduction` is `none`, `Out` is returned unchanged. When `reduction` is `mean`, the mean of the output is returned: :math:`Out = MEAN(Out)`. When `reduction` is `sum`, the sum of the output is returned: :math:`Out = SUM(Out)`.
Collaborator review comment: this paragraph has no counterpart in the English docs; it is suggested to add it there as well.
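For illustration only, a minimal sketch of how the formula above maps onto the ``reduction`` modes, using standard Paddle tensor ops and assuming labels take values -1 or 1 (the toy tensors are hypothetical):

.. code-block:: python

    import paddle

    # Toy data; soft margin loss expects labels of -1 or 1.
    x = paddle.to_tensor([[0.5, -1.2], [2.0, 0.3]], dtype='float32')
    y = paddle.to_tensor([[1.0, -1.0], [-1.0, 1.0]], dtype='float32')

    # Element-wise term of the formula: log(1 + exp(-y * x)).
    out = paddle.log(1 + paddle.exp(-y * x))

    print(out)               # reduction='none': shape [2, 2]
    print(paddle.mean(out))  # reduction='mean': scalar
    print(paddle.sum(out))   # reduction='sum': scalar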



Parameters
:::::::::
- **reduction** (str, optional) - Specifies the reduction to apply to the output. Available values are ``'none'``, ``'mean'`` and ``'sum'``. Defaults to ``'mean'``, which returns the mean of the loss; ``'sum'`` returns the sum of the loss; ``'none'`` returns the loss unreduced.
- **name** (str, optional) - Name of the operation (optional, defaults to None). For more information, see :ref:`api_guide_Name`.

Shape
:::::::::
- **input** (Tensor) - Shape :math:`[N, *]`, where N is the batch size and `*` means any number of additional dimensions. Data type is float32 or float64.
- **label** (Tensor) - Shape :math:`[N, *]`; ``label`` has the same shape and data type as ``input``.
- **output** (Tensor) - The output Tensor. If :attr:`reduction` is ``'none'``, the output shape is :math:`[N, *]`, the same as ``input``. If :attr:`reduction` is ``'mean'`` or ``'sum'``, the output shape is :math:`[1]`.

Returns
:::::::::
A callable object that computes the SoftMarginLoss.


Code Examples
:::::::::
COPY-FROM: paddle.nn.SoftMarginLoss
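The ``COPY-FROM`` directive pulls the runnable example from the Python docstring; as a minimal usage sketch, assuming the constructor signature documented above (toy tensors are hypothetical):

.. code-block:: python

    import paddle

    input = paddle.to_tensor([[0.5, -1.2], [2.0, 0.3]], dtype='float32')
    label = paddle.to_tensor([[1.0, -1.0], [-1.0, 1.0]], dtype='float32')

    # Default reduction='mean' returns a single averaged loss value.
    soft_margin_loss = paddle.nn.SoftMarginLoss()
    print(soft_margin_loss(input, label))   # scalar

    # reduction='none' keeps the per-element losses.
    loss_none = paddle.nn.SoftMarginLoss(reduction='none')
    print(loss_none(input, label))          # shape [2, 2]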
35 changes: 35 additions & 0 deletions docs/api/paddle/nn/functional/soft_margin_loss_cn.rst
@@ -0,0 +1,35 @@
.. _cn_api_paddle_nn_functional_soft_margin_loss:
Collaborator review comment: there is an extra "s" in this anchor label.


soft_margin_loss
-------------------------------

.. py:function:: paddle.nn.functional.soft_margin_loss(input, label, reduction='mean', name=None)

Computes the binary classification loss between the input `input` and the `label`.


The loss is computed according to the following formula:

.. math::
\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{\text{x.nelement}()}


Finally, the `reduction` operation is applied to the intermediate output `Out`. When `reduction` is `none`, `Out` is returned unchanged. When `reduction` is `mean`, the mean of the output is returned: :math:`Out = MEAN(Out)`. When `reduction` is `sum`, the sum of the output is returned: :math:`Out = SUM(Out)`.


Parameters
:::::::::
- **input** (Tensor) - Shape :math:`[N, *]`, where N is the batch size and `*` means any number of additional dimensions. Data type is float32 or float64.
- **label** (Tensor) - Shape :math:`[N, *]`; ``label`` has the same shape and data type as ``input``.
- **reduction** (str, optional) - Specifies the reduction to apply to the output. Available values are ``'none'``, ``'mean'`` and ``'sum'``. Defaults to ``'mean'``, which returns the mean of the loss; ``'sum'`` returns the sum of the loss; ``'none'`` returns the loss unreduced.
- **name** (str, optional) - Name of the operation (optional, defaults to None). For more information, see :ref:`api_guide_Name`.


Returns
:::::::::
- The output Tensor. If :attr:`reduction` is ``'none'``, the output shape is :math:`[N, *]`, the same as ``input``. If :attr:`reduction` is ``'mean'`` or ``'sum'``, the output shape is :math:`[1]`.


Code Examples
:::::::::
COPY-FROM: paddle.nn.functional.soft_margin_loss
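Similarly, a minimal usage sketch of the functional form, assuming the signature documented above (toy tensors are hypothetical):

.. code-block:: python

    import paddle
    import paddle.nn.functional as F

    input = paddle.to_tensor([[0.5, -1.2], [2.0, 0.3]], dtype='float32')
    label = paddle.to_tensor([[1.0, -1.0], [-1.0, 1.0]], dtype='float32')

    # 'mean' (default), 'sum' and 'none' follow the reduction semantics above.
    print(F.soft_margin_loss(input, label))                    # scalar
    print(F.soft_margin_loss(input, label, reduction='sum'))   # scalar
    print(F.soft_margin_loss(input, label, reduction='none'))  # shape [2, 2]

    # The 'none' result should match the documented element-wise formula.
    manual = paddle.log(1 + paddle.exp(-label * input))
    print(paddle.allclose(F.soft_margin_loss(input, label, reduction='none'), manual))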