Flaky test: test_operator.test_layer_norm #10227
Should change to 1E-2.
Thanks, I will make the change along with the PR.
I find sometimes it's really hard to make the numerical check pass for all the seeds 😅
Same here. We should come up with a way to avoid comparing gradient values that are too close to zero.
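The failure mode being discussed comes from the standard tolerance formula `|a - b| <= atol + rtol * |b|` (the form used by `numpy.isclose` and by MXNet's `assert_almost_equal`-style checks): when the reference value is near zero, the `rtol` term vanishes and only `atol` can absorb noise. A minimal sketch (the gradient values here are made up for illustration):

```python
import numpy as np

analytic = np.array([1.0, 1e-8])     # hypothetical analytic gradients
numeric = np.array([1.0001, 2e-8])   # hypothetical finite-difference gradients

# With atol=0, the near-zero entry fails: its relative error is 100%,
# even though the absolute difference (1e-8) is negligible.
print(np.isclose(numeric, analytic, rtol=1e-3, atol=0))     # [ True False]

# A small atol absorbs the noise in entries close to zero.
print(np.isclose(numeric, analytic, rtol=1e-3, atol=1e-7))  # [ True  True]
```

This is why simply shrinking `rtol` never fixes the near-zero comparisons; the two tolerances guard against different failure modes.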
Happened again: http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/incubator-mxnet/detail/PR-11986/2/pipeline
Happened again: #16336 (unrelated PR)
I guess we need to use 1E-3.
Um... 1E-1, 1E-2, now 1E-3... it has to stop somewhere. Anyway, as a temporary fix I will push that as a PR.
@ChaiBapchya Yep, would it be possible to just test a few fixed seeds? Also, we could remove the finite-difference test.
Fixing the seed isn't a good practice, is it? And removing the finite-difference check would make this test less strict, right?
@ChaiBapchya In theory it will not, if our manual backward logic is correct.
@ChaiBapchya I mean the test would be as strict as the original. Also, fixing the seed works in some randomness tests.
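The pattern being proposed — a finite-difference gradient check run over a few fixed seeds instead of one random seed — can be sketched as follows. This uses `tanh` (analytic gradient `1 - tanh(x)^2`) as a stand-in for layer norm, and the seed list and helper names are illustrative, not MXNet's API:

```python
import numpy as np

def numeric_grad(f, x, h=1e-5):
    # elementwise central difference: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

for seed in (0, 1, 2):  # a few fixed seeds, rather than one random seed
    rng = np.random.RandomState(seed)
    x = rng.uniform(-2.0, 2.0, size=10)
    analytic = 1 - np.tanh(x) ** 2
    numeric = numeric_grad(np.tanh, x)
    # rtol handles the large entries; atol absorbs near-zero ones
    assert np.allclose(numeric, analytic, rtol=1e-3, atol=1e-6)
```

With a central difference the truncation error is O(h²), so a modest `rtol` passes reliably; the flakiness in the original test comes from comparing entries where the true gradient itself is close to zero.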
It failed in one of my PRs on Windows, Python2 GPU. Please confirm whether the difference is expected. If so, consider using a bigger atol for comparing two values that are close to zero.
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/incubator-mxnet/detail/PR-9552/34/pipeline/577
@sxjscience