
Integrate MKL-DNN leakyrelu #16075

Merged
Merged 9 commits into apache:master on Sep 24, 2019

Conversation

xinyu-intel
Contributor

Description

Integrate the LeakyReLU operator with MKL-DNN activation primitives when act_type is leakyrelu or elu (gelu will be available when #16073 is ready).

Main Change:

  • Register the LeakyReLU operator with NNVM.
  • Share the MKL-DNN integration code with the Activation operator.
  • Support MKL-DNN elu and leakyrelu (a rough mapping sketch follows this list).
  • Support conv + leakyrelu fusion and quantization, which is a common structure in darknet and yolov3.
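
To make the act_type mapping concrete, here is a minimal sketch (not the code merged in this PR) of how the LeakyReLU act_type values could be translated to MKL-DNN 0.x eltwise algorithm kinds. The helper name GetEltwiseAlgoForLeakyReLU and the exact act_type strings checked are assumptions for illustration only:

```cpp
// Hypothetical sketch: map LeakyReLU act_type values onto MKL-DNN 0.x
// eltwise algorithm kinds. mkldnn::algorithm::eltwise_relu with a non-zero
// alpha behaves as a leaky ReLU (alpha is the negative slope), and
// eltwise_elu covers the elu case; gelu would follow once #16073 lands.
#include <mkldnn.hpp>
#include <stdexcept>
#include <string>

mkldnn::algorithm GetEltwiseAlgoForLeakyReLU(const std::string& act_type) {
  if (act_type == "leaky") {
    return mkldnn::algorithm::eltwise_relu;  // alpha carries the negative slope
  } else if (act_type == "elu") {
    return mkldnn::algorithm::eltwise_elu;   // alpha carries the elu scale
  }
  // prelu/rrelu/selu (and gelu, for now) stay on the existing MXNet kernels.
  throw std::runtime_error("act_type not supported by the MKL-DNN path: " + act_type);
}
```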

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to the relevant JIRA issue created (except PRs with tiny changes)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
      • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
      • Nightly tests are added for complicated/long-running ones (e.g. changing distributed kvstore)
      • Build tests will be added for build configuration changes (e.g. adding a new build option with NCCL)
  • Code is well-documented:
      • For user-facing API changes, the API doc string has been updated.
      • For new C++ functions in header files, their functionality and arguments are documented.
      • For new examples, a README.md is added to explain what the example does, the source of the dataset, the expected performance on the test set, and a reference to the original paper if applicable
      • Check the API doc at http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
  • To the best of my knowledge, examples are either not affected by this change or have been fixed to be compatible with this change

Changes

  • Feature1, tests, (and when applicable, API doc)
  • Feature2, tests, (and when applicable, API doc)

Comments

  • If this change is a backward incompatible change, why must this change be made.
  • Interesting edge cases to note here

src/operator/leaky_relu-inl.h (review thread: outdated, resolved)
@xinyu-intel changed the title from [WIP]Integrate MKL-DNN leakyrelu to Integrate MKL-DNN leakyrelu on Sep 3, 2019
src/operator/leaky_relu-inl.h (review thread: resolved)
src/operator/leaky_relu-inl.h (review thread: outdated, resolved)
src/operator/leaky_relu.cc (review thread: outdated, resolved)
src/operator/leaky_relu.cc (review thread: outdated, resolved)
const LeakyReLUParam& param = nnvm::get<LeakyReLUParam>(attrs.parsed);
if (SupportMKLDNNLeakyRelu(param, inputs[0])) {
  MKLDNN_OPCHECK_INIT(true, outputs.size(), inputs, outputs);
  MKLDNNLeakyReluBackward(attrs, ctx, inputs.at(0), inputs.at(1), req[0],
Member

Are only two inputs needed? Is it possible to use a vector so we can have a more unified interface?

Contributor Author

We now use the inputs vector directly and only need the first two of the three NDArrays, so a CHECK_GE(inputs.size(), 2U) is added. Does that make sense?
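
For context, a rough sketch of how the vector-based backward dispatch discussed above could look. Everything beyond the identifiers visible in the quoted hunk (LeakyReLUParam, SupportMKLDNNLeakyRelu, MKLDNN_OPCHECK_INIT, MKLDNNLeakyReluBackward) is an assumption; in particular the function name LeakyReLUGradComputeExCPU, the exact MKLDNNLeakyReluBackward signature, and the FallBackCompute/LeakyReLUGradCompute fallback are illustrative, not the code that was merged.

```cpp
// Sketch only: a vector-based backward entry point for the MKL-DNN LeakyReLU
// path, built around the identifiers visible in the quoted hunk. The real
// signature and fallback helpers in the merged PR may differ.
#include "./leaky_relu-inl.h"
#include "./nn/mkldnn/mkldnn_base-inl.h"
#include "./nn/mkldnn/mkldnn_ops-inl.h"

static void LeakyReLUGradComputeExCPU(const nnvm::NodeAttrs& attrs,
                                      const OpContext& ctx,
                                      const std::vector<NDArray>& inputs,
                                      const std::vector<OpReqType>& req,
                                      const std::vector<NDArray>& outputs) {
  const LeakyReLUParam& param = nnvm::get<LeakyReLUParam>(attrs.parsed);
  if (SupportMKLDNNLeakyRelu(param, inputs[0])) {
    // Only the first two NDArrays (output gradient and forward input) are
    // consumed, so guard the vector length instead of taking two scalars.
    CHECK_GE(inputs.size(), 2U);
    MKLDNN_OPCHECK_INIT(true, outputs.size(), inputs, outputs);
    MKLDNNLeakyReluBackward(attrs, ctx, inputs, req[0], outputs[0]);
    return;
  }
  // Fall back to the default CPU kernel when MKL-DNN cannot handle the case.
  FallBackCompute(LeakyReLUGradCompute<cpu>, attrs, ctx, inputs, req, outputs);
}
```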

src/operator/leaky_relu.cc (review thread: outdated, resolved)
src/operator/leaky_relu.cc (review thread: outdated, resolved)
src/operator/nn/mkldnn/mkldnn_act.cc (review thread: outdated, resolved)
src/operator/nn/mkldnn/mkldnn_ops-inl.h (review thread: outdated, resolved)
tests/python/unittest/test_gluon.py (review thread: outdated, resolved)
@pengzhao-intel
Contributor

@xinyu-intel MKL-DNN has been updated to 0.21 now.

@xinyu-intel
Contributor Author

@pengzhao-intel I'll start another PR to enable gelu after this PR is merged :)

@pengzhao-intel
Contributor

LGTM
@TaoLv @ciyongch please take a review :)

@pengzhao-intel
Contributor

@xinyu-intel is this ready to merge?

@xinyu-intel
Contributor Author

yes

@pengzhao-intel
Contributor

Merging now.

@pengzhao-intel pengzhao-intel merged commit a77bd75 into apache:master Sep 24, 2019
drivanov pushed a commit to drivanov/incubator-mxnet that referenced this pull request Sep 26, 2019
* add mkldnn leakyrelu support

* improve mkldnn act param

* register gpu path

* remove old code

* trigger

* fix lint and improve backward function
larroy pushed a commit to larroy/mxnet that referenced this pull request Sep 28, 2019