I'm cross-posting from a question on Google Groups because I believe there is a problem with `checkgrad`, although it's possible I have overlooked something. Consider this function:
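A minimal sketch of a function matching this description (written in Python/PyTorch for illustration; the name `f` and the shape of `vect` are assumptions, and the `(fx, dfdx)` return pair mirrors the opfunc convention that `optim.checkgrad` expects):

```python
import torch

# fx is the dot product between the flattened input and a flattened
# tensor of ones, so the analytic gradient d(fx)/dx is vect, i.e. all ones.
vect = torch.ones(3, 4)  # assumed shape

def f(x):
    fx = torch.dot(x.flatten(), vect.flatten())
    dfdx = vect.clone()  # analytic gradient: ones of the size of x
    return fx, dfdx
```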
`fx` is the dot product between the flattened `x` and the flattened `vect`, therefore the Jacobian of `fx` should simply be `vect` (ones of the size of `x`). Yet when checking with `optim.checkgrad`, `dC_est` evaluates to `5.0 *` (ones of the size of `x`). PyTorch's autograd returns the result I would expect, a tensor of ones:
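As far as I can tell, `dC_est` is the finite-difference gradient estimate that `checkgrad` compares against the analytic gradient `dC`, so it should also come out as ones here. A minimal PyTorch sketch of such an autograd check (shapes are assumptions):

```python
import torch

x = torch.randn(3, 4, requires_grad=True)
vect = torch.ones(3, 4)

# Same fx as above; backprop through the dot product.
fx = torch.dot(x.flatten(), vect.flatten())
fx.backward()

print(x.grad)  # tensor of ones, matching the analytic Jacobian
```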