
Fix fc_mkldnn format issue #38890

Merged into PaddlePaddle:develop on Feb 2, 2022 (1 commit)

Conversation

@zuzg (Contributor) commented Jan 11, 2022

PR types

Bug fixes

PR changes

OPs

Describe

Adding the fc mkldnn passes to the ernie int8 model caused a format error, so the fix changes NHWC to NCHW in this narrow case. It is a workaround for the error "DNNL FC only supports in_num_col_dims equal to 2 when input format is equal to ncw."
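
In outline, the workaround adds a guard in the mkldnn quantization pass: when the op being handled is an fc whose in_num_col_dims attribute equals 2, the output_format attribute is set to NCHW instead of the usual NHWC. Below is a minimal sketch of the idea only; the condition and the variable names (op, q_desc) follow the diff excerpt quoted in the review thread further down, and the exact placement inside the pass is not reproduced here.

    // Sketch only, not the verbatim patch: force NCHW for the narrow case of
    // a quantized fc op declaring in_num_col_dims == 2, unless the pass
    // already carries an explicit data_layout attribute.
    if (op->Op()->Type() == "fc" &&
        op->Op()->GetAttrIfExists<int>("in_num_col_dims") == 2) {
      q_desc.SetAttr("output_format", Has("data_layout")
                                          ? Get<std::string>("data_layout")
                                          : "NCHW");
    }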

@paddle-bot-old

Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@wozna (Contributor) previously approved these changes Jan 11, 2022


LGTM

@lidanqing-intel (Contributor) left a comment


LGTM

@zuzg (Contributor, Author) commented Jan 13, 2022

Hi @Aganlengzi, could you please review?

Review thread on the changed lines:

        op->Op()->GetAttrIfExists<int>("in_num_col_dims") == 2) {
      q_desc.SetAttr("output_format", Has("data_layout")
                                          ? Get<std::string>("data_layout")
                                          : "NCHW");
@lidanqing-intel (Contributor) commented Jan 16, 2022


AddAttr<int>("in_num_col_dims",
                 "(int, default 1), The fc op can take tensors with more than "
                 "two dimensions as its inputs.")
        .SetDefault(1)

Hi, does it support in_num_col_dims == 1 too, since in_num_col_dims is 1 by default? Sorry I did not review carefully last time. Like this? Any issue?

 if (op->Op()->Type() == "fc" &&
      op->Op()->GetAttrIfExists<int>("in_num_col_dims") <= 2) {

@zuzg (Contributor, Author) replied


There are no issues for fc with in_num_col_dims == 1. The problem appears only in the narrow case mentioned in the description. For example, test_analyzer_int8_resnet50, which has the fc passes enabled and a quantized fc op, shows no issues.

@lidanqing-intel (Contributor) left a comment


LGTM

@paddle-bot-old

Sorry to inform you that 2dd308f's CIs have passed for more than 7 days. To prevent PR conflicts, you need to re-run all CIs manually.

@lidanqing-intel (Contributor)

@baoachun Could you please review this PR so that we can ask @Aganlengzi to merge it?

@wozna requested a review from @baoachun on January 26, 2022 at 10:15
@baoachun (Contributor) previously approved these changes Jan 26, 2022


LGTM

@wozna closed this Jan 26, 2022
@wozna reopened this Jan 26, 2022
@wozna (Contributor) left a comment


LGTM

@lidanqing-intel (Contributor) left a comment


LGTM

@lidanqing-intel merged commit 633c71c into PaddlePaddle:develop Feb 2, 2022
@lidanqing-intel (Contributor)

Since @baoachun already reviewed this PR and left an LGTM, I merged it.
