[Contrib] Add MKL DNN option #4323
@@ -31,6 +31,9 @@ extern "C" {
#else
#include <cblas.h>
#endif
#if USE_MKL_DNN == 1
#include <dnnl.h>
#endif
}

namespace tvm {

Inline comments on the new #if USE_MKL_DNN == 1 block:

I think the code logic here needs a small change as well:

@ZhennanQin As @icemelon9 mentioned here, we need both cblas and dnnl because the latter is used for sgemm only.
@@ -40,12 +43,19 @@ using namespace runtime;

inline CBLAS_TRANSPOSE BooleanToTranspose(bool trans) { return trans ? CblasTrans : CblasNoTrans; }

inline char BooleanToTransposeChar(bool trans) { return trans ? 'T' : 'N'; }

struct CblasSgemmOp {
  typedef float TDatatype;
  void operator()(bool ta, bool tb, int M, int N, int K, float alpha, float* A, int lda, float* B,
                  int ldb, float beta, float* C, int ldc) {
#if USE_MKL_DNN == 1
    dnnl_sgemm(BooleanToTransposeChar(tb), BooleanToTransposeChar(ta), N, M, K, alpha, B,
               ldb, A, lda, beta, C, ldc);
#else
    cblas_sgemm(CblasColMajor, BooleanToTranspose(ta), BooleanToTranspose(tb), M, N, K, alpha, A,
                lda, B, ldb, beta, C, ldc);
#endif
  }
};
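One detail worth spelling out (my reading of the call above, not something stated in the patch): dnnl_sgemm operates on row-major matrices, whereas the cblas_sgemm call uses CblasColMajor. A column-major M x N buffer is the same memory as a row-major N x M buffer, so the column-major product C = op(A) * op(B) can be computed by a row-major gemm that swaps the two operands and the M/N dimensions while keeping each operand's transpose flag and leading dimension. The self-contained sketch below checks that equivalence with naive loops (no CBLAS or DNNL needed), for the non-transposed case:

```cpp
// Sketch: check that a column-major gemm C = A * B matches a row-major gemm on the
// same buffers with the operands and M/N swapped, the trick used by the dnnl_sgemm
// call above. Naive loops stand in for both libraries.
#include <cassert>
#include <cmath>
#include <cstdio>
#include <vector>

// C(M x N) = A(M x K) * B(K x N), all column-major; leading dims are row counts.
void gemm_col_major(int M, int N, int K, const float* A, int lda,
                    const float* B, int ldb, float* C, int ldc) {
  for (int j = 0; j < N; ++j)
    for (int i = 0; i < M; ++i) {
      float acc = 0.0f;
      for (int k = 0; k < K; ++k) acc += A[i + k * lda] * B[k + j * ldb];
      C[i + j * ldc] = acc;
    }
}

// C(M x N) = A(M x K) * B(K x N), all row-major; leading dims are column counts.
void gemm_row_major(int M, int N, int K, const float* A, int lda,
                    const float* B, int ldb, float* C, int ldc) {
  for (int i = 0; i < M; ++i)
    for (int j = 0; j < N; ++j) {
      float acc = 0.0f;
      for (int k = 0; k < K; ++k) acc += A[i * lda + k] * B[k * ldb + j];
      C[i * ldc + j] = acc;
    }
}

int main() {
  const int M = 3, N = 4, K = 5;
  std::vector<float> A(M * K), B(K * N), C1(M * N), C2(M * N);
  for (int i = 0; i < M * K; ++i) A[i] = 0.1f * i;
  for (int i = 0; i < K * N; ++i) B[i] = 0.2f * i - 1.0f;

  // Column-major view (what cblas_sgemm with CblasColMajor would compute):
  // A is M x K with lda = M, B is K x N with ldb = K, C is M x N with ldc = M.
  gemm_col_major(M, N, K, A.data(), M, B.data(), K, C1.data(), M);

  // Row-major view of the very same buffers: swap the operands, swap M and N,
  // and keep the leading dimensions; this is the argument order passed to dnnl_sgemm.
  gemm_row_major(N, M, K, B.data(), K, A.data(), M, C2.data(), M);

  for (int i = 0; i < M * N; ++i) assert(std::fabs(C1[i] - C2[i]) < 1e-3f);
  std::printf("column-major gemm and swapped row-major gemm agree\n");
  return 0;
}
```

The leading dimensions pass through unchanged because they describe memory strides, which do not change when the buffers are reinterpreted as row-major.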
@@ -32,7 +32,7 @@ def _declaration_dense(cfg, data, weight, bias=None, out_dtype=None):
    if "cblas" in target.libs:
        C = cblas.matmul(data, weight, False, True)
        if bias is not None:
-            C = tvm.compute(C.shape, lambda i, j: C[i, j] + bias[j].astype(out_dtype),
+            C = tvm.compute(C.shape, lambda i, j: C[i, j] + bias[j],
                            tag=tag.BROADCAST)
        return C

This change doesn't seem to be related to MKL DNN?

Yes, this is to fix a bug when using the cblas library.
What's the strategy if both USE_BLAS and USE_MKL_DNN are set?

Updated the logic here. MKL DNN will only be used when USE_BLAS is not none. When both are set, MKL DNN will only be used in the sgemm op, as the library has limited support for the BLAS operators. And I find that the MKL DNN kernel achieves better performance than MKL in TVM.
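To make the "both flags set" behavior concrete, here is a rough, hypothetical sketch (not the actual TVM source; names and structure are illustrative) of how the two guards compose: only the fp32 gemm is rerouted to DNNL, while the fp64 gemm, and the cblas.h dependency itself, still come from whatever BLAS library USE_BLAS selects.

```cpp
// Hypothetical sketch of the dispatch when USE_BLAS selects a CBLAS backend and
// USE_MKL_DNN=1: DNNL replaces only the fp32 gemm; everything else stays on CBLAS.
#include <cstdio>
#include <cblas.h>
#if USE_MKL_DNN == 1
#include <dnnl.h>  // for dnnl_sgemm (row-major, fp32 only)
#endif

inline CBLAS_TRANSPOSE ToTrans(bool t) { return t ? CblasTrans : CblasNoTrans; }
inline char ToTransChar(bool t) { return t ? 'T' : 'N'; }

struct SgemmOp {  // fp32: goes to DNNL when enabled, otherwise to CBLAS
  void operator()(bool ta, bool tb, int M, int N, int K, float alpha, float* A, int lda,
                  float* B, int ldb, float beta, float* C, int ldc) {
#if USE_MKL_DNN == 1
    dnnl_sgemm(ToTransChar(tb), ToTransChar(ta), N, M, K, alpha, B, ldb, A, lda, beta, C, ldc);
#else
    cblas_sgemm(CblasColMajor, ToTrans(ta), ToTrans(tb), M, N, K, alpha, A, lda,
                B, ldb, beta, C, ldc);
#endif
  }
};

struct DgemmOp {  // fp64: always CBLAS; DNNL provides no double-precision gemm helper
  void operator()(bool ta, bool tb, int M, int N, int K, double alpha, double* A, int lda,
                  double* B, int ldb, double beta, double* C, int ldc) {
    cblas_dgemm(CblasColMajor, ToTrans(ta), ToTrans(tb), M, N, K, alpha, A, lda,
                B, ldb, beta, C, ldc);
  }
};

int main() {
  // Identity times B, column-major 2x2: C should equal B whichever path is compiled in.
  float A[] = {1, 0, 0, 1}, B[] = {1, 2, 3, 4}, C[4] = {0};
  SgemmOp()(false, false, 2, 2, 2, 1.0f, A, 2, B, 2, 0.0f, C, 2);
  std::printf("%g %g %g %g\n", C[0], C[1], C[2], C[3]);  // 1 2 3 4
  return 0;
}
```

Linking this sketch needs both a CBLAS implementation and libdnnl when USE_MKL_DNN=1, which is consistent with the earlier comment that both cblas and dnnl are required.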
Another minor nit is to change USE_MKL_DNN to either USE_DNNL or USE_MKLDNN. The first one aligns with the renaming trend of the library. The second one follows the coding convention in MKL-DNN before renaming and in other projects like MXNet.

Fixed.

Thanks for the quick turnaround.