Hi @NullSenseStudio, thanks for your feedback. We'll include a fix for non-contiguous inputs in layer_norm in the upcoming torch-directml build, releasing soon.
torch-directml: 0.2.1.dev240521
python: 3.11.7
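A minimal sketch of the kind of comparison described here, assuming the non-contiguous input comes from transposing the last two dimensions and that the two printed numbers below are the mean and max absolute difference against the CPU result (the shapes and the exact metric are assumptions):

```python
import torch
import torch.nn.functional as F
import torch_directml

dml = torch_directml.device()

# Transposing the last two dims gives a non-contiguous view of the same data.
base = torch.randn(8, 32, 64)
x_cpu = base.transpose(1, 2)            # shape (8, 64, 32), non-contiguous
x_dml = base.to(dml).transpose(1, 2)    # same non-contiguous view on DirectML

normalized_shape = (x_cpu.shape[-1],)   # normalize over the last dimension
y_cpu = F.layer_norm(x_cpu, normalized_shape)
y_dml = F.layer_norm(x_dml, normalized_shape)

# Compare the DirectML result against the CPU reference.
diff = (y_dml.cpu() - y_cpu).abs()
print(diff.mean().item(), diff.max().item())
```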
Running this with the non-contiguous input prints `0.6974620223045349 3.6198389530181885`. The result isn't close to what it is on CPU or other devices.
But it works as expected as long as the input is made contiguous; the same comparison then prints `7.381072464340832e-08 2.384185791015625e-07`.
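Continuing the assumed sketch above, the workaround is to materialize the view before calling layer_norm:

```python
# Workaround: copy the view into contiguous memory before layer_norm.
y_fixed = F.layer_norm(x_dml.contiguous(), normalized_shape)
diff_fixed = (y_fixed.cpu() - y_cpu).abs()
print(diff_fixed.mean().item(), diff_fixed.max().item())
```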
Another kind of non-contiguous input causes an error instead of an incorrect result.
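As a hypothetical illustration only (the specific tensor from the report is an assumption), a strided slice is another way to end up with a non-contiguous layout that layer_norm reportedly rejects on this build:

```python
# Hypothetical example of a non-contiguous input; the exact tensor from the
# report is an assumption. A step-2 slice leaves the tensor non-contiguous.
z = torch.randn(8, 64, 64).to(dml)[:, ::2, :]
print(z.is_contiguous())                       # False
out = F.layer_norm(z, (z.shape[-1],))          # reportedly raises on this build
```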