This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Fix flaky test: test_mkldnn.test_activation #12377 #12418

Merged 2 commits on Sep 8, 2018
3 changes: 1 addition & 2 deletions — tests/python/mkl/test_mkldnn.py

@@ -281,7 +281,6 @@ def check_pooling_training(stype):
         check_pooling_training(stype)


-@unittest.skip("Flaky test: https://github.com/apache/incubator-mxnet/issues/12377")
 @with_seed()
 def test_activation():
     def check_activation_training(stype):
@@ -292,7 +291,7 @@ def check_activation_training(stype):
             in_location = [mx.nd.array(data_tmp).tostype(stype)]

             test = mx.symbol.Activation(data, act_type="relu")
-            check_numeric_gradient(test, in_location, numeric_eps=1e-2, rtol=0.16, atol=1e-4)
+            check_numeric_gradient(test, in_location, numeric_eps=1e-6, rtol=0.16, atol=1e-4)
Contributor:
Hm, this change lowers the epsilon by four orders of magnitude. If this test was flaky before due to computational precision, wouldn't this change make it worse?

Contributor:
Actually, it improves the precision of the reference results, because the smaller eps is used in the finite difference method.
Previously, the flakiness was caused by the large eps, which cannot capture the small local difference accurately.
Numerical calculation is sometimes a little tricky :)
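
To make this concrete, here is a minimal NumPy sketch (illustrative only, not MXNet's actual check_numeric_gradient; the sample point is made up) showing how the central finite difference behaves for ReLU near the kink at zero:

    # Illustrative sketch, not MXNet's check_numeric_gradient: central finite
    # difference on ReLU near x == 0. The sample point x = 0.003 is made up.
    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def numeric_grad(f, x, eps):
        # Central difference: (f(x + eps) - f(x - eps)) / (2 * eps)
        return (f(x + eps) - f(x - eps)) / (2.0 * eps)

    x = 0.003                      # falls inside the eps=1e-2 band around zero
    analytic = 1.0 if x > 0 else 0.0

    for eps in (1e-2, 1e-6):
        print("eps=%g: numeric=%.4f, analytic=%.1f"
              % (eps, numeric_grad(relu, x, eps), analytic))
    # eps=1e-2 -> numeric ~0.65 (error ~0.35, well beyond rtol=0.16)
    # eps=1e-6 -> numeric 1.0, matching the analytic gradient

Away from zero ReLU is linear, so the central difference is exact for any eps; only samples landing within eps of the kink give a poor reference, which is why shrinking eps reduces the flakiness.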

Contributor Author:
This epsilon affects the baseline calculation, not the MKL-DNN calculation. The smaller the epsilon is, the more accurate the baseline becomes (the numerical reference gradient, computed in the spirit of Theano's gradient verification). So this change won't make it worse.
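
As a rough back-of-the-envelope check (the uniform [-1, 1] data range below is an assumption, not necessarily what the test draws), the fraction of samples that land where the finite-difference reference is poor scales with eps:

    # Rough illustration only; the uniform [-1, 1] range is assumed, not taken
    # from the actual test data.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.uniform(-1.0, 1.0, size=100_000)
    for eps in (1e-2, 1e-6):
        frac = np.mean(np.abs(data) < eps)
        print("eps=%g: %.4f%% of samples land within eps of the ReLU kink"
              % (eps, 100.0 * frac))
    # eps=1e-2: roughly 1% of samples sit in the inaccurate band, enough for
    # occasional failures; eps=1e-6: such samples are vanishingly rare.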

Contributor:
@pengzhao-intel @luobao-intel thanks for your explanations!


     stypes = ['row_sparse', 'default']
     for stype in stypes: