Adept uses global gradient norm clipping of 0.5 by default, which could hinder learning (or at least slow it down) if the gradients are consistently large and clipping kicks in at every training step. At minimum we should log the gradient norm to TensorBoard so the user can see for themselves how often clipping occurs and decide whether to adjust it.
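A minimal sketch of what the logging could look like (not Adept's actual training loop; model, optimizer, and loss here are placeholders): PyTorch's `clip_grad_norm_` returns the total norm computed *before* clipping, so it can be written straight to TensorBoard with no extra passes over the parameters.

```python
import torch
from torch import nn
from torch.utils.tensorboard import SummaryWriter

model = nn.Linear(4, 2)          # placeholder network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
writer = SummaryWriter()

for step in range(10):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).pow(2).mean()  # placeholder loss
    loss.backward()
    # clip_grad_norm_ returns the global norm *before* clipping,
    # so logging it shows whether the 0.5 threshold is being hit.
    total_norm = nn.utils.clip_grad_norm_(model.parameters(), max_norm=0.5)
    writer.add_scalar("grad_norm/global", float(total_norm), step)
    optimizer.step()

writer.close()
```

Plotting `grad_norm/global` against the 0.5 line makes it obvious whether clipping fires occasionally (fine) or at every step (likely throttling learning).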