This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

doc add relu (#20193)
barry-jin committed Apr 20, 2021
1 parent 5da68f7 commit 7dba11a
Showing 2 changed files with 4 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/python_docs/python/api/npx/index.rst
@@ -82,6 +82,7 @@ More operators
:toctree: generated/

sigmoid
relu
smooth_l1
softmax
log_softmax
3 changes: 3 additions & 0 deletions src/operator/numpy/np_elemwise_unary_op_basic.cc
@@ -30,8 +30,11 @@ namespace op {

MXNET_OPERATOR_REGISTER_UNARY(_npx_relu)
.describe(R"code(Computes rectified linear activation.

.. math::
   max(features, 0)

)code" ADD_FILELINE)
.set_attr<FCompute>("FCompute<cpu>", UnaryOp::Compute<cpu, mshadow_op::relu>)
.set_attr<nnvm::FGradient>("FGradient", ElemwiseGradUseOut{"_backward_relu"});
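The registered operator computes `max(features, 0)` elementwise, and its gradient is registered via `ElemwiseGradUseOut`, meaning the backward pass is computed from the operator's *output* rather than its input. A minimal pure-Python sketch of that semantics (an illustrative reimplementation, not the MXNet kernel itself; `relu` and `relu_grad_use_out` are hypothetical names):

```python
def relu(features):
    """Rectified linear activation: max(v, 0) for each element."""
    return [max(v, 0.0) for v in features]

def relu_grad_use_out(out_grad, out):
    """Backward pass in the ElemwiseGradUseOut style: the gradient is
    derived from the forward *output* -- 1 where out > 0, else 0.
    This works because out > 0 exactly where the input was > 0."""
    return [g * (1.0 if o > 0 else 0.0) for g, o in zip(out_grad, out)]

x = [-2.0, -0.5, 0.0, 1.5, 3.0]
y = relu(x)
print(y)                                      # [0.0, 0.0, 0.0, 1.5, 3.0]
print(relu_grad_use_out([1.0] * len(x), y))   # [0.0, 0.0, 0.0, 1.0, 1.0]
```

Computing the gradient from the output is a common choice for relu since it avoids keeping the input tensor alive for the backward pass.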
