Momentum shouldn't be stored in the layers any more. This will free us up to use a broader set of optimisation algorithms. We will, however, need to provide a class for fast updates and manipulations of learnable parameters.
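A minimal sketch of what such a class could look like, assuming nothing about Grenade's eventual API (all names here are hypothetical):

```haskell
-- Hypothetical sketch: the vector-space style operations an optimiser
-- needs over a layer's learnable parameters. None of these names are
-- Grenade's; they only illustrate the shape of the interface.
class LearnableParams p where
  -- | Element-wise addition, e.g. parameters plus a scaled gradient step.
  addParams   :: p -> p -> p
  -- | Scale every parameter by a constant (learning rate, decay, ...).
  scaleParams :: Double -> p -> p

-- Trivial instance for a single scalar parameter, just to show usage.
instance LearnableParams Double where
  addParams   = (+)
  scaleParams = (*)

-- | Plain SGD expressed against the class: p' = p - rate * gradient.
-- Momentum, Adam, etc. would be written the same way, keeping their
-- own state outside the layers.
sgdStep :: LearnableParams p => Double -> p -> p -> p
sgdStep rate p gradient = addParams p (scaleParams (negate rate) gradient)
```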
The `Gradient` associated type family shouldn't exist; we'll just return a `Network` containing gradient weights.
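To illustrate the idea with a simplified, non-GADT sketch (hypothetical `Layer` and `Network` types, not Grenade's real definitions): if the gradient of a network is itself a `Network`, with gradient values in the weight slots, an update step is just a structural zip.

```haskell
-- Hypothetical, simplified types; not Grenade's real definitions.
data Layer = Layer
  { weights :: [Double]
  , biases  :: [Double]
  } deriving Show

newtype Network = Network [Layer]
  deriving Show

-- | One SGD step: parameters minus rate times gradient, where the
-- gradient is represented as another Network of the same shape.
applyGradients :: Double -> Network -> Network -> Network
applyGradients rate (Network params) (Network grads) =
  Network (zipWith step params grads)
  where
    step (Layer w b) (Layer gw gb) =
      Layer (zipWith move w gw) (zipWith move b gb)
    move x g = x - rate * g
```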
`randomNetwork` shouldn't exist. Networks where all layers have a `Random` instance will also have a `Random` instance.
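A sketch of how such an instance could be built inductively over the list of layer types, using a toy heterogeneous-list network (assumed here only so the instances have something to hang off; it is not Grenade's real type):

```haskell
{-# LANGUAGE DataKinds, FlexibleContexts, FlexibleInstances, GADTs,
             KindSignatures, TypeOperators #-}

import Data.Kind     (Type)
import System.Random (Random (..))

-- Toy network indexed by its list of layer types (illustrative only).
data Network (layers :: [Type]) where
  NNil  :: Network '[]
  (:~>) :: layer -> Network layers -> Network (layer ': layers)

-- The empty network is trivially random.
instance Random (Network '[]) where
  random g    = (NNil, g)
  randomR _ g = (NNil, g)

-- A layer with a Random instance, consed onto a random tail, gives a
-- random network: the instance falls out inductively over the layer list.
instance (Random layer, Random (Network layers))
      => Random (Network (layer ': layers)) where
  random g =
    let (l, g')     = random g
        (rest, g'') = random g'
    in  (l :~> rest, g'')
  randomR _ g = random g  -- a range over whole networks isn't meaningful here
```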
HuwCampbell changed the title from "Num & Floating instances for Network" to "Remove momentum from Layers" on Mar 8, 2017.