flaky test: test_operator.test_activation #13915
Creating a PR to disable the test until a fix can be provided.
Closing this, since the flaky test has been disabled.
So this got closed because the fundamental tests (which should pass) were disabled. That doesn't seem appropriate, given that these are basic computations that should always work. Here is the code for the activation of type "softrelu" (I traced it to ensure that this is exactly the code that gets executed). The gradient computation looks just plain wrong:

```cpp
/*! \brief SoftReLU, also known as softplus activation */
MXNET_UNARY_MATH_OP(softrelu_grad, -math::expm1(-a));
```
I realized I misunderstood the outermost logic. Going back to the original pull request from 2015, I figured out that the argument supplied to the `_grad` function is the computed value of the forward pass, not the original input. So the current code is correct: for softrelu, `y = log(1 + exp(x))`, and `-expm1(-y) = 1 - exp(-y) = exp(x) / (1 + exp(x)) = sigmoid(x)`, which is exactly the derivative of softplus.
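The identity above is easy to check numerically. The following is an illustrative standalone sketch (plain Python, not MXNet code; the function names are mine): it computes the softrelu forward pass, feeds the *output* into the gradient formula from the macro, and compares the result against the analytic derivative `sigmoid(x)`.

```python
import math

def softrelu(x):
    # forward pass: softplus, y = log(1 + exp(x))
    return math.log1p(math.exp(x))

def softrelu_grad_from_output(y):
    # mirrors MXNET_UNARY_MATH_OP(softrelu_grad, -math::expm1(-a)):
    # the argument `a` is the forward-pass output y, not the input x
    return -math.expm1(-y)

def sigmoid(x):
    # analytic derivative of softplus with respect to its input x
    return 1.0 / (1.0 + math.exp(-x))

# the two agree to floating-point precision across a range of inputs
for x in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    y = softrelu(x)
    assert abs(softrelu_grad_from_output(y) - sigmoid(x)) < 1e-12
```

This also shows why passing the forward output (rather than recomputing from `x`) is a reasonable design: `exp(-y)` is cheap and numerically stable, whereas recomputing `sigmoid(x)` would duplicate work already done in the forward pass.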
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Fwindows-gpu/detail/PR-13609/7/pipeline