
RMSprop

RMSprop Example

RMSprop is an unpublished, adaptive learning rate optimization algorithm first proposed by Geoff Hinton in Lecture 6 of his online class "Neural Networks for Machine Learning". RMSprop and Adadelta were developed independently at around the same time, and both try to resolve Adagrad's problem of a continually diminishing learning rate. [1]

The difference between Adadelta and RMSprop is that Adadelta removes the learning rate hyperparameter entirely, replacing it with the root mean square of recent parameter updates.
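To make the update rule concrete, here is a minimal NumPy sketch of RMSprop (not the repository's implementation): each parameter's step is scaled by the root of an exponentially decaying average of its squared gradients. The names `rmsprop_update`, `cache`, and the hyperparameter values are illustrative choices, not from the source.

```python
import numpy as np

def rmsprop_update(theta, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    # Exponentially decaying average of squared gradients: E[g^2]
    cache = decay * cache + (1 - decay) * grad ** 2
    # Divide the step by the RMS of recent gradients (eps avoids division by zero)
    theta = theta - lr * grad / (np.sqrt(cache) + eps)
    return theta, cache

# Toy usage: minimize f(x) = x^2, whose gradient is 2x
theta = np.array([5.0])
cache = np.zeros_like(theta)
for _ in range(2000):
    grad = 2 * theta
    theta, cache = rmsprop_update(theta, grad, cache)
```

Because each step is divided by the running RMS of gradients, the effective step size stays close to `lr` regardless of gradient magnitude, which is what prevents Adagrad-style decay to zero.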

[1] Sebastian Ruder (2016). An overview of gradient descent optimization algorithms. arXiv preprint arXiv:1609.04747.

Code