
Revert "[Submodule] Upgrade to oneDNN v1.6.3" #19180

Closed
wants to merge 1 commit into from

Conversation

@leezu (Contributor) commented Sep 18, 2020

Tests whether #19153 introduced the gcc8 incompatibility.

[2020-09-17T17:48:04.979Z] ______________________ test_dc_hybridblock_deferred_init _______________________
[2020-09-17T17:48:04.979Z] [gw0] linux -- Python 3.6.9 /opt/rh/rh-python36/root/usr/bin/python3
[2020-09-17T17:48:04.979Z] 
[2020-09-17T17:48:04.979Z]     def test_dc_hybridblock_deferred_init():
[2020-09-17T17:48:04.979Z]         class MyBlock(mx.gluon.HybridBlock):
[2020-09-17T17:48:04.979Z]             def __init__(self):
[2020-09-17T17:48:04.979Z]                 super().__init__()
[2020-09-17T17:48:04.979Z]                 self.dense = mx.gluon.nn.Dense(units=10)
[2020-09-17T17:48:04.979Z]                 self.weight = mx.gluon.Parameter('weight', allow_deferred_init=True)
[2020-09-17T17:48:04.979Z]     
[2020-09-17T17:48:04.979Z]             def infer_shape(self, x):
[2020-09-17T17:48:04.979Z]                 self.weight.shape = (x.shape[1], )
[2020-09-17T17:48:04.979Z]     
[2020-09-17T17:48:04.979Z]             def forward(self, x):
[2020-09-17T17:48:04.979Z]                 return self.dense(x) + self.weight.data(x.context)
[2020-09-17T17:48:04.979Z]     
[2020-09-17T17:48:04.979Z]         net = MyBlock()
[2020-09-17T17:48:04.979Z]         net.initialize()
[2020-09-17T17:48:04.979Z] >       _assert_dc_gluon(_dc_gluon_simple_setup, net, numpy=False)
[2020-09-17T17:48:04.979Z] 
[2020-09-17T17:48:04.979Z] tests/python/unittest/test_deferred_compute.py:504: 
[2020-09-17T17:48:04.979Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[2020-09-17T17:48:04.979Z] tests/python/unittest/test_deferred_compute.py:421: in _assert_dc_gluon
[2020-09-17T17:48:04.979Z]     _all_same(ys_np, ys_hybrid_np)
[2020-09-17T17:48:04.979Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[2020-09-17T17:48:04.979Z] 
[2020-09-17T17:48:04.979Z] arrays1 = [array([        nan,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z]         0.07460972, -0.08127148, -0.32424796,...33878, -0.10624887,
[2020-09-17T17:48:04.979Z]         0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z]       dtype=float32), ...]
[2020-09-17T17:48:04.979Z] arrays2 = [array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z]         0.07460972, -0.08127148, -0.32424796,...33878, -0.10624887,
[2020-09-17T17:48:04.979Z]         0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z]       dtype=float32), ...]
[2020-09-17T17:48:04.979Z] message = ''
[2020-09-17T17:48:04.979Z] 
[2020-09-17T17:48:04.979Z]     def _all_same(arrays1, arrays2, message=''):
[2020-09-17T17:48:04.979Z]         same = all(np.array_equal(a1, a2) for a1, a2 in zip(arrays1, arrays2))
[2020-09-17T17:48:04.979Z]         if not same:
[2020-09-17T17:48:04.979Z] >           raise AssertionError('Arrays not equal ({}):\n{}\n\n{}'.format(message, arrays1, arrays2))
[2020-09-17T17:48:04.979Z] E           AssertionError: Arrays not equal ():
[2020-09-17T17:48:04.979Z] E           [array([        nan,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([        nan,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([        nan,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32)]
[2020-09-17T17:48:04.979Z] E           
[2020-09-17T17:48:04.979Z] E           [array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32), array([ 0.01286458,  0.2107217 , -0.06851891,  0.16233878, -0.10624887,
[2020-09-17T17:48:04.979Z] E                   0.07460972, -0.08127148, -0.32424796, -0.0124862 , -0.1862593 ],
[2020-09-17T17:48:04.979Z] E                 dtype=float32)]
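
For reference, the comparison fails because the first element of the imperative outputs (arrays1, i.e. ys_np) is NaN while the corresponding hybridized output is finite, so np.array_equal reports a mismatch. A quick way to confirm this locally is to scan the output lists for NaNs before comparing them; below is a minimal plain-NumPy sketch (the helper name and the commented usage are illustrative, not part of the test suite):

import numpy as np

def report_nans(arrays, label):
    # Print the index of every array in the list that contains a NaN,
    # together with the positions of the NaN entries inside it.
    for i, a in enumerate(arrays):
        a = np.asarray(a)
        if np.isnan(a).any():
            positions = np.argwhere(np.isnan(a)).ravel().tolist()
            print('{}[{}] contains NaN at positions {}'.format(label, i, positions))

# Illustrative usage with the two lists compared by _all_same above:
# report_nans(ys_np, 'ys_np')
# report_nans(ys_hybrid_np, 'ys_hybrid_np')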

@mxnet-bot

Hey @leezu, thanks for submitting the PR.
All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands:

  • To trigger all jobs: @mxnet-bot run ci [all]
  • To trigger specific jobs: @mxnet-bot run ci [job1, job2]

CI supported jobs: [centos-gpu, windows-gpu, unix-gpu, unix-cpu, website, sanity, clang, centos-cpu, windows-cpu, edge, miscellaneous]


Note:
Only the following 3 categories can trigger CI: PR Author, MXNet Committer, Jenkins Admin.
All CI tests must pass before the PR can be merged.

@leezu marked this pull request as draft on September 18, 2020, 18:40
@leezu (Contributor, Author) commented Sep 18, 2020

@mxnet-bot run ci [windows-gpu]

@mxnet-bot

Jenkins CI successfully triggered : [windows-gpu]

@leezu (Contributor, Author) commented Sep 19, 2020

Here is a screenshot showing that the build passes when the oneDNN update is reverted:

[screenshot: passing CI build]

@leezu closed this on Sep 19, 2020
@szha deleted the revert-19153-dnnl-v1.6 branch on September 19, 2020, 03:01