This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Fix flaky test: test_mkldnn.test_activation #12377 #12418

Merged

2 commits merged into apache:master on Sep 8, 2018

Conversation

luobao-intel
Contributor

Description

Fixes the flaky test failure in test_mkldnn.test_activation (#12377).
The problem lies in the finite-difference method used for gradient comparison: the large eps caused an inaccurate numerical gradient, and this pull request reduces the eps for a more accurate calculation.
@pengzhao-intel

@pengzhao-intel
Contributor

Please enable the case which has been skipped in another PR.

@pengzhao-intel
Contributor

@marcoabreu @lebeg please help review

@@ -292,7 +291,7 @@ def check_activation_training(stype):
     in_location = [mx.nd.array(data_tmp).tostype(stype)]

     test = mx.symbol.Activation(data, act_type="relu")
-    check_numeric_gradient(test, in_location, numeric_eps=1e-2, rtol=0.16, atol=1e-4)
+    check_numeric_gradient(test, in_location, numeric_eps=1e-6, rtol=0.16, atol=1e-4)
Contributor

Hm, this change lowers the epsilon by four orders of magnitude. If this test was flaky before due to computational precision, wouldn't this change make it worse?

Contributor

Actually, it will improve the precision of the reference results, because the smaller eps makes the finite-difference approximation more accurate.
Previously, the flaky test was caused by the large eps, which could not resolve the small differences.
Numerical calculation is sometimes a little tricky :)
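The effect under discussion can be sketched with a tiny standalone example (this is not the MXNet test itself; `relu` and `numeric_grad` here are hypothetical helpers). Near relu's kink at zero, a central finite difference with a large eps straddles the kink and misestimates the gradient, while a smaller eps stays on one side of it:

```python
def relu(x):
    # relu(x) = max(x, 0); its gradient is 1 for x > 0, 0 for x < 0
    return x if x > 0 else 0.0

def numeric_grad(f, x, eps):
    # central finite difference: (f(x + eps) - f(x - eps)) / (2 * eps)
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

x = 0.005  # close to the kink at 0; the true gradient here is 1.0
print(numeric_grad(relu, x, eps=1e-2))  # ~0.75: the perturbation crosses 0
print(numeric_grad(relu, x, eps=1e-6))  # ~1.0: both points stay on one side
```

The trade-off is that an extremely small eps eventually runs into floating-point cancellation, so eps cannot be shrunk without limit; 1e-6 is still comfortably above that regime for float64 inputs of this scale.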

Contributor Author

This epsilon affects the baseline calculation, not the mkldnn calculation. The smaller the epsilon, the more accurate the baseline gradient (the reference, as in theano), so this change won't make it worse.

Contributor

@pengzhao-intel @luobao-intel thanks for your explanations!

@marcoabreu
Contributor

@szha @eric-haibin-lin

@lupesko
Contributor

lupesko commented Sep 4, 2018

Thanks for the fix @luobao-intel !
Copying in more folks for review: @anirudh2290 @azai91 @mseth10

@eric-haibin-lin eric-haibin-lin merged commit 445967e into apache:master Sep 8, 2018
lebeg added a commit to lebeg/incubator-mxnet that referenced this pull request Sep 11, 2018
aaronmarkham pushed a commit to aaronmarkham/incubator-mxnet that referenced this pull request Sep 11, 2018
marcoabreu pushed a commit that referenced this pull request Sep 12, 2018
zhreshold added a commit that referenced this pull request Sep 12, 2018
szha pushed a commit that referenced this pull request Sep 12, 2018
* Revert "Removing the re-size for validation data, which breaking the validation accuracy of CIFAR training (#12362)"

This reverts commit ceabcaa.

* Revert "[MXNET-580] Add SN-GAN example (#12419)"

This reverts commit 46a5cee.

* Revert "Remove regression checks for website links (#12507)"

This reverts commit 619bc3e.

* Revert "Revert "Fix flaky test: test_mkldnn.test_activation #12377 (#12418)" (#12516)"

This reverts commit 7ea0533.

* Revert "further bump up tolerance for sparse dot (#12527)"

This reverts commit 90599e1.

* Revert "Fix broken URLs (#12508)"

This reverts commit 3d83c89.

* Revert "Temporarily disable flaky tests (#12520)"

This reverts commit 35ca13c.

* Revert "Add support for more req patterns for bilinear sampler backward (#12386)"

This reverts commit 4ee866f.

* Revert "Change the way NDArrayIter handle the last batch (#12285)"

This reverts commit 597a637.
anirudh2290 pushed a commit to anirudh2290/mxnet that referenced this pull request Sep 19, 2018