Conversation
src/operator/leaky_relu.cc
const LeakyReLUParam& param = nnvm::get<LeakyReLUParam>(attrs.parsed);
if (SupportMKLDNNLeakyRelu(param, inputs[0])) {
  MKLDNN_OPCHECK_INIT(true, outputs.size(), inputs, outputs);
  MKLDNNLeakyReluBackward(attrs, ctx, inputs.at(0), inputs.at(1), req[0],
Only two inputs are needed? Is it possible to use a vector so we can have a more unified interface?
Now it uses the inputs vector directly; we need the first two of the three NDArrays, so I added a CHECK_GE(inputs.size(), 2U). Does that make sense?
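The change discussed above, taking the inputs vector directly and checking its size up front, can be sketched in isolation. Note that `NDArray` and both function names below are stand-ins to keep the sketch self-contained, not the actual MXNet types or the final signature in leaky_relu.cc:

```cpp
#include <cassert>
#include <vector>

// Placeholder for mxnet::NDArray, only so this sketch compiles on its own.
struct NDArray {};

// Guard corresponding to the suggested CHECK_GE: the backward pass consumes
// only the first two of the three NDArrays passed in, so it is enough to
// verify that at least two are present.
bool HasBackwardInputs(const std::vector<NDArray>& inputs) {
  return inputs.size() >= 2U;  // CHECK_GE(inputs.size(), 2U) with dmlc logging
}

// Backward entry point taking the whole vector, giving the more unified
// interface asked for in the review. (Hypothetical signature.)
void MKLDNNLeakyReluBackwardSketch(const std::vector<NDArray>& inputs) {
  assert(HasBackwardInputs(inputs));
  const NDArray& out_grad = inputs.at(0);  // gradient flowing from the output
  const NDArray& in_data  = inputs.at(1);  // forward input data
  (void)out_grad;
  (void)in_data;
  // ... build the MKL-DNN eltwise backward primitive from these two arrays ...
}
```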
@xinyu-intel MKL-DNN is updated to 0.21 now.
@pengzhao-intel I'll start another PR to enable gelu after this PR is merged. :)
@xinyu-intel is this ready to merge?
Yes.
Merging now.
* add mkldnn leakyrelu support
* improve mkldnn act param
* register gpu path
* remove old code
* trigger
* fix lint and improve backward function
Description
Integrate the `LeakyRelu` operator with MKL-DNN activation primitives when the `act_type` is `leakyrelu` or `elu` (`gelu` will be available when #16073 is ready).

Main Change:
* `LeakyRelu` operator with NNVM.
* `Activation` operator: `elu` and `leakyrelu`.
* `conv + leakyrelu` fusion and quantization, which appears as a common structure in darknet and yolov3.

Checklist
Essentials
Please feel free to remove inapplicable items for your PR.
Changes
Comments
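For reference, the scalar math behind the two `act_type` values this PR routes to MKL-DNN eltwise primitives can be written out directly. This is only the mathematical definition of leaky relu and elu (with slope/scale parameter `alpha`), not the code added by the PR; MKL-DNN evaluates these inside its primitives:

```cpp
#include <cmath>

// leaky relu: identity for positive inputs, a small linear slope otherwise.
float leaky_relu(float x, float alpha) {
  return x > 0.0f ? x : alpha * x;
}

// Gradient of leaky relu with respect to its input, used by the backward pass.
float leaky_relu_grad(float x, float alpha) {
  return x > 0.0f ? 1.0f : alpha;
}

// elu: identity for positive inputs, a saturating exponential otherwise.
float elu(float x, float alpha) {
  return x > 0.0f ? x : alpha * (std::exp(x) - 1.0f);
}
```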