fix prelu, test=develop (PaddlePaddle#2478)
qili93 authored Aug 26, 2020
1 parent 1d11f57 commit f597cfb
Showing 2 changed files with 8 additions and 7 deletions.
doc/fluid/api_cn/nn_cn/activation_cn.rst (2 changes: 1 addition & 1 deletion)

@@ -16,7 +16,7 @@ activation
     activation_cn/LeakyReLU_cn.rst
     activation_cn/LogSigmoid_cn.rst
     activation_cn/LogSoftmax_cn.rst
-    activation_cn/PRelu_cn.rst
+    activation_cn/PReLU_cn.rst
     activation_cn/ReLU_cn.rst
     activation_cn/ReLU6_cn.rst
     activation_cn/SELU_cn.rst
@@ -1,10 +1,10 @@
-.. _cn_api_nn_PRelu:
+.. _cn_api_nn_PReLU:
 
-PRelu
+PReLU
 -------------------------------
 
-.. py:class:: paddle.nn.PRelu(num_parameters=1, init=0.25, weight_attr=None, name=None)
+.. py:class:: paddle.nn.PReLU(num_parameters=1, init=0.25, weight_attr=None, name=None)
 
-PRelu activation layer (PRelu Activation Operator). The calculation formula is as follows:
+PReLU activation layer (PReLU Activation Operator). The calculation formula is as follows:
 
 If approximate calculation is used:

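The formula lines themselves are collapsed in this view. For reference, the conventional PReLU definition, consistent with the class signature above (weight is the learnable parameter initialized to init), is:

    PReLU(x) = max(0, x) + weight * min(0, x)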
@@ -23,7 +23,7 @@ PReLU activation layer (PReLU Activation Operator). The calculation formula is as follows:
 
 Shape:
 ::::::::::
-    - input: a Tensor of any shape.
+    - input: a Tensor of any shape; the default data type is float32.
     - output: a Tensor with the same shape as input.
 
 Code Example
@@ -35,13 +35,14 @@ PReLU activation layer (PReLU Activation Operator). The calculation formula is as follows:
     import numpy as np
     paddle.disable_static()
+    paddle.set_default_dtype("float64")
     data = np.array([[[[-2.0,  3.0, -4.0,  5.0],
                        [ 3.0, -4.0,  5.0, -6.0],
                        [-7.0, -8.0,  8.0,  9.0]],
                       [[ 1.0, -2.0, -3.0,  4.0],
                        [-5.0,  6.0,  7.0, -8.0],
-                       [ 6.0,  7.0,  8.0,  9.0]]]], 'float32')
+                       [ 6.0,  7.0,  8.0,  9.0]]]], 'float64')
     x = paddle.to_tensor(data)
     m = paddle.nn.PReLU(1, 0.25)
     out = m(x)

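The two dtype edits are consistent with each other: once paddle.set_default_dtype("float64") is in effect, building the NumPy array as 'float64' presumably avoids a float32/float64 mismatch when paddle.to_tensor(data) creates the input tensor. As a sanity check on the example, here is a minimal NumPy-only sketch of the same computation, assuming the conventional PReLU formula and the single shared weight 0.25 from paddle.nn.PReLU(1, 0.25):

    import numpy as np

    # Same input as the documentation example above.
    data = np.array([[[[-2.0,  3.0, -4.0,  5.0],
                       [ 3.0, -4.0,  5.0, -6.0],
                       [-7.0, -8.0,  8.0,  9.0]],
                      [[ 1.0, -2.0, -3.0,  4.0],
                       [-5.0,  6.0,  7.0, -8.0],
                       [ 6.0,  7.0,  8.0,  9.0]]]], 'float64')

    weight = 0.25  # one shared parameter, as in paddle.nn.PReLU(1, 0.25)

    # PReLU: positive entries pass through, negative entries are scaled
    # by the weight.
    out = np.maximum(0.0, data) + weight * np.minimum(0.0, data)
    print(out[0, 0, 0])  # [-0.5  3.  -1.   5. ]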