Pass current learning rate to schedule() in LearningRateScheduler #8865

Merged (5 commits) on Feb 2, 2018
keras/callbacks.py: 10 changes (7 additions, 3 deletions)
@@ -558,8 +558,8 @@ class LearningRateScheduler(Callback):

     # Arguments
         schedule: a function that takes an epoch index as input
-            (integer, indexed from 0) and returns a new
-            learning rate as output (float).
+            (integer, indexed from 0) and the current learning rate,
+            and returns a new learning rate as output (float).
         verbose: int. 0: quiet, 1: update messages.
     """

@@ -571,7 +571,11 @@ def __init__(self, schedule, verbose=0):
     def on_epoch_begin(self, epoch, logs=None):
         if not hasattr(self.model.optimizer, 'lr'):
             raise ValueError('Optimizer must have a "lr" attribute.')
-        lr = self.schedule(epoch)
+        lr = float(K.get_value(self.model.optimizer.lr))
+        try:  # new API
+            lr = self.schedule(epoch, lr=lr)
+        except TypeError:  # old API for backward compatibility
+            lr = self.schedule(epoch)
         if not isinstance(lr, (float, np.float32, np.float64)):
             raise ValueError('The output of the "schedule" function '
                              'should be float.')
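For context, here is a minimal sketch of how the new API could be used once this change is merged. The schedule functions, model, and data below are illustrative assumptions, not part of the PR:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import LearningRateScheduler


def halve_every_10_epochs(epoch, lr):
    # New-style schedule (hypothetical example): receives the current
    # learning rate, so it can decay relative to whatever value the
    # optimizer holds right now.
    if epoch > 0 and epoch % 10 == 0:
        return lr * 0.5
    return lr


def fixed_decay(epoch):
    # Old-style schedule (hypothetical example): only the epoch index.
    # Still supported because the callback falls back to schedule(epoch)
    # when the two-argument call raises a TypeError.
    return 0.01 * (0.95 ** epoch)


model = Sequential([Dense(1, input_shape=(4,))])
model.compile(optimizer='sgd', loss='mse')

x = np.random.rand(32, 4)
y = np.random.rand(32, 1)

model.fit(x, y, epochs=20, verbose=0,
          callbacks=[LearningRateScheduler(halve_every_10_epochs, verbose=1)])
```

An old-style schedule such as `fixed_decay` above keeps working unchanged: the call `self.schedule(epoch, lr=lr)` raises a TypeError for it, and the callback falls back to `self.schedule(epoch)`.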