This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Fix softmax behavior to not cast up the accumulator if no output dtype is specified #14759

Closed
wants to merge 6 commits into from

Commits on Apr 22, 2019

  1. 0901ca2
  2. 4754dc7
  3. Update test_softmax_dtype: use AType in np_softmax, change tolerance values based on AType
     nswamy committed Apr 22, 2019
     2f66745
  4. b3dc30d
  5. Revert tests to the earlier version, since verifying against numpy on such a large tensor takes a long time and shows no difference between this change and the previous one
     nswamy committed Apr 22, 2019
     feda006
  6. c47b024
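The commits above mention a NumPy reference softmax (`np_softmax`) whose accumulation dtype (`AType`) drives the comparison tolerances. A minimal sketch of that idea, assuming an illustrative signature (this is not the PR's actual test code; the `atype` parameter and the tolerance table are hypothetical):

```python
import numpy as np

def np_softmax(x, axis=-1, atype=None):
    """Reference softmax; accumulates in `atype` if given, else in x's own dtype.
    `atype` is a hypothetical parameter mirroring the PR's AType idea."""
    atype = x.dtype if atype is None else np.dtype(atype)
    x = x.astype(atype)
    # Subtract the row max for numerical stability before exponentiating.
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

# Illustrative per-dtype tolerances: looser when accumulating in float16.
tolerances = {np.dtype(np.float16): 1e-2, np.dtype(np.float32): 1e-5}

x = np.random.uniform(-1, 1, (2, 3)).astype(np.float16)
ref = np_softmax(x.astype(np.float64))   # high-precision reference
out = np_softmax(x)                      # accumulate in the input dtype (float16)
tol = tolerances[x.dtype]
assert np.allclose(out.astype(np.float64), ref, rtol=tol, atol=tol)
```

The key point the PR tests is that when no output dtype is requested, the accumulator follows the input dtype rather than being silently promoted, so tolerances in the test must scale with that dtype's precision.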