This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

flaky test: test_bilinear_resize_op #14902

Closed
arcadiaphy opened this issue May 7, 2019 · 7 comments

Comments

@arcadiaphy
Member

http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-cpu/detail/PR-14894/1/pipeline

@lobanov-m, perhaps some errors were introduced in #13226?

@mxnet-label-bot
Contributor

Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so that the appropriate MXNet community members can help resolve it.
Here are my recommended labels: Test, Flaky

@lobanov-m
Contributor

lobanov-m commented May 8, 2019

I will check, thank you.


@haojin2
Contributor

haojin2 commented May 20, 2019

Another failure: http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Fwindows-gpu/detail/PR-14959/12/pipeline
Seems like the error only barely exceeds the tolerance, and the check runs with atol set to zero:

======================================================================
FAIL: test_operator_gpu.test_bilinear_resize_op
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\nose\case.py", line 197, in runTest
    self.test(*self.arg)
  File "C:\Python27\lib\site-packages\nose\util.py", line 620, in newfunc
    return func(*arg, **kw)
  File "C:\jenkins_slave\workspace\ut-python-gpu\tests\python\gpu\../unittest\common.py", line 177, in test_new
    orig_test(*args, **kwargs)
  File "C:\jenkins_slave\workspace\ut-python-gpu\tests\python\gpu\../unittest\test_operator.py", line 7498, in test_bilinear_resize_op
    check_bilinear_resize_modes_op(shape_0, shape_1=shape_1, mode='like')
  File "C:\jenkins_slave\workspace\ut-python-gpu\tests\python\gpu\../unittest\test_operator.py", line 7472, in check_bilinear_resize_modes_op
    check_numeric_gradient(resize_sym, [data_np, date_np_like])
  File "C:\jenkins_slave\workspace\ut-python-gpu\windows_package\python\mxnet\test_utils.py", line 980, in check_numeric_gradient
    ("NUMERICAL_%s"%name, "BACKWARD_%s"%name))
  File "C:\jenkins_slave\workspace\ut-python-gpu\windows_package\python\mxnet\test_utils.py", line 503, in assert_almost_equal
    raise AssertionError(msg)
AssertionError: 
Items are not equal:
Error 1.004549 exceeds tolerance rtol=0.010000, atol=0.000000.  Location of maximum error:(1, 0, 12, 12), a=0.004321, b=0.004278
 NUMERICAL_data: array([[[[0.84230304, 0.        , 0.57831407, ..., 0.1641959 ,
          0.        , 0.331074  ],
         [0.        , 0.        , 0.        , ..., 0.        ,...
 BACKWARD_data: array([[[[0.84230524, 0.        , 0.5783209 , ..., 0.16419744,
          0.        , 0.33107594],
         [0.        , 0.        , 0.        , ..., 0.        ,...
-------------------- >> begin captured logging << --------------------
common: INFO: Setting test np/mx/python random seeds, use MXNET_TEST_SEED=1608036320 to reproduce.
--------------------- >> end captured logging << ---------------------
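For context, a check like check_numeric_gradient compares the operator's analytical backward pass against a finite-difference estimate of the gradient. A minimal sketch of that idea (this is illustrative, not MXNet's actual implementation; numeric_grad and the test function are made up here) shows why the two gradients can never agree exactly:

```python
import numpy as np

def numeric_grad(f, x, eps=1e-4):
    """Estimate df/dx elementwise with central differences.

    Central differences carry an O(eps**2) truncation error, so the
    estimate only approximates the analytical gradient.
    """
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    for _ in it:
        idx = it.multi_index
        orig = x[idx]
        x[idx] = orig + eps
        fp = f(x)
        x[idx] = orig - eps
        fm = f(x)
        x[idx] = orig          # restore the perturbed element
        grad[idx] = (fp - fm) / (2 * eps)
    return grad

# Example: f(x) = sum(x**3); the exact gradient is 3*x**2. The central
# difference yields 3*x**2 + eps**2, i.e. a small but nonzero deviation.
x = np.array([[1.0, -2.0], [0.5, 3.0]])
est = numeric_grad(lambda v: np.sum(v ** 3), x.copy())
print(np.max(np.abs(est - 3 * x ** 2)))  # on the order of eps**2
```

Because the finite-difference estimate carries this inherent truncation error on top of floating-point rounding, comparing it to the analytical gradient with atol=0 leaves no slack for noise on near-zero entries.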

I'm experimenting with bumping the atol up to a reasonable level now, so please don't rush to disable the test.
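The reported error of 1.004549 is consistent with a relative-error metric of the form |a - b| / (atol + rtol * |b|). That formula is an assumption for illustration, not a quote of assert_almost_equal's code, and the a/b values printed in the log are rounded, which would explain the small numerical discrepancy:

```python
# Sketch of the (assumed) tolerance metric behind the failure above.
# A check fails when err > 1; with atol = 0 it reduces to a pure
# relative comparison, which is unforgiving for tiny gradient values.
a, b = 0.004321, 0.004278      # NUMERICAL vs BACKWARD value at (1, 0, 12, 12)
rtol, atol = 0.01, 0.0
err = abs(a - b) / (atol + rtol * abs(b))
print(err)  # just above 1.0, matching the reported failure
```

With any reasonable nonzero atol, a difference of about 4e-5 between two gradients of about 4e-3 would pass comfortably, which is why bumping atol is the fix being tried here.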

@haojin2
Contributor

haojin2 commented May 21, 2019

@perdasilva Fix in #15011.

@haojin2
Contributor

haojin2 commented May 31, 2019

There have been no further occurrences of this issue, so I'm closing it for now.

@haojin2 haojin2 closed this as completed May 31, 2019