
new LightningModule hook "configure_callbacks" #5621

Merged
43 commits merged into master on Feb 13, 2021

Conversation

awaelchli
Contributor

@awaelchli awaelchli commented Jan 23, 2021

What does this PR do?

Fixes #5615

Introduces a configure_callbacks hook in LightningModule
Possible implementations:

Option 1: "appending" (CURRENT IMPLEMENTATION ON THIS PR)
Simply appends the returned callbacks to the trainer's existing list. Existing callbacks cannot be modified directly.

def configure_callbacks(self):
    my_callback = MyCallback()
    return [my_callback]

# this is what the trainer would do:
trainer.callbacks.extend(model.configure_callbacks())
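The appending behavior can be sketched end-to-end with plain stand-ins; `Callback`, `MyCallback`, `Model`, and `Trainer` below are minimal mock-ups for illustration, not the real Lightning classes:

```python
class Callback:
    """Minimal stand-in for a Lightning callback."""

class MyCallback(Callback):
    """A model-specific callback."""

class Model:
    def configure_callbacks(self):
        # Model-specific callbacks travel with the model, making it portable.
        return [MyCallback()]

class Trainer:
    def __init__(self, callbacks=None):
        self.callbacks = list(callbacks or [])

    def attach_model(self, model):
        # Option 1: the trainer simply appends whatever the hook returns,
        # so trainer-level callbacks always come first.
        self.callbacks.extend(model.configure_callbacks())

trainer = Trainer(callbacks=[Callback()])
trainer.attach_model(Model())
```

After `attach_model`, the trainer holds its original callback followed by `MyCallback`; the model cannot reorder or remove trainer-level callbacks under this option.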

Option 2: "replacement"
Lets the user define the order and modify existing callbacks.

def configure_callbacks(self, callbacks):
    my_callback = MyCallback()
    return [my_callback] + callbacks

# this is what the trainer would do:
trainer.callbacks = model.configure_callbacks()

Option 3: "inplace modification"
Lets the user define the order and modify existing callbacks.

def configure_callbacks(self, callbacks):
    my_callback = MyCallback()
    callbacks.append(my_callback)
    # no return

# this is what the trainer would do:
model.configure_callbacks(trainer.callbacks)

Options 2 and 3 give the user more control over how callbacks are used and in which order they are executed.
However, if the LightningModule modifies the callbacks passed to the Trainer, it is less transparent what code actually gets executed. On the other hand, a user can already mutate the trainer's callback list through Trainer.callbacks if they want to.

TODOs:

  • decide on the interface, discussion needed (see proposed options above)
  • update docs
  • find the best place to invoke the hook
  • add tests
  • add changelog
  • Merge "Force ModelCheckpoint callback to run last" (#5731), which ensures that model checkpoints run last
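The "checkpoints run last" reordering referenced in #5731 amounts to a stable partition of the callback list. This is a sketch of the idea, not the actual Lightning implementation; `ModelCheckpoint` here is a placeholder class:

```python
class Callback:
    """Minimal stand-in for a Lightning callback."""

class ModelCheckpoint(Callback):
    """Placeholder for Lightning's ModelCheckpoint; illustrative only."""

def move_checkpoints_last(callbacks):
    # Stable partition: non-checkpoint callbacks keep their relative order,
    # checkpoint callbacks are moved to the end so they run last.
    others = [cb for cb in callbacks if not isinstance(cb, ModelCheckpoint)]
    checkpoints = [cb for cb in callbacks if isinstance(cb, ModelCheckpoint)]
    return others + checkpoints
```

Running checkpoints last matters because other callbacks (e.g. early stopping) may update metrics that the checkpoint decision depends on.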

Poll options

    • ❤️
    • 🎉
    • 🚀

@awaelchli awaelchli added the feature (Is an improvement or enhancement) and callback labels Jan 23, 2021
@awaelchli awaelchli changed the base branch from master to release/1.2-dev January 23, 2021 03:07
@awaelchli awaelchli added this to the 1.2 milestone Jan 23, 2021
@Lightning-AI Lightning-AI deleted a comment from review-notebook-app bot Jan 23, 2021
@akihironitta
Contributor

+1 for Option 1: "appending" since it is simple and similar to configure_optimizers() so that users don't get confused. Also, I'm curious about the motivation for adding this feature...

@awaelchli
Contributor Author

I'm curious about the motivation for adding this feature...

This was requested internally, see grid chat. The motivation is to have model-specific callbacks defined in the model, making it more portable.

@ananthsub
Contributor

@awaelchli what are the implications for checkpoint loading/saving? how are callback states treated here vs callbacks attached to the trainer? Asking since it came up in #5542

@awaelchli
Contributor Author

awaelchli commented Jan 24, 2021

@ananthsub The callbacks attached through this hook here will still go into the trainer as if they were added directly to the Trainer.
Therefore, the same rules and limitations apply as for the Trainer.
As a general rule of thumb, callbacks that have state should not change their class name between saving and loading (the class name is the key under which their state is stored in and retrieved from the checkpoint), and there can only be one callback of each type.
Callbacks that don't save state do not have these limitations, since they don't get saved.
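The class-name keying described above can be sketched as follows; `StatefulCallback`, `state_dict`, and the dict layout are illustrative stand-ins, not the real checkpoint format:

```python
class StatefulCallback:
    """Illustrative callback with state that should survive checkpointing."""

    def __init__(self):
        self.counter = 0

    def state_dict(self):
        return {"counter": self.counter}

def dump_callback_states(callbacks):
    # State is stored keyed by the callback's class name. This is why
    # renaming a class between saving and loading breaks restoration,
    # and why two stateful callbacks of the same type would collide.
    return {type(cb).__name__: cb.state_dict() for cb in callbacks}
```

A callback without a `state_dict` simply contributes nothing to the checkpoint, which is why stateless callbacks escape these limitations.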

@awaelchli
Contributor Author

@PyTorchLightning/core-contributors would you find such a hook to configure model-specific callbacks directly in the LightningModule useful, and if so, which API would you prefer (3 choices above)?

@rohitgr7
Contributor

option 1 looks better to me since trainer callbacks can be configured/modified using self.trainer.callbacks if required.

This was requested internally, see grid chat.

what's grid chat??

@carmocca
Contributor

I agree with @akihironitta

@s-rog
Contributor

s-rog commented Jan 26, 2021

While I would personally prefer 2 for maximum flexibility, I agree with Akihiro that it could be unintuitive for users.

@Borda Borda added the design (Includes a design discussion) label Jan 26, 2021
@teddykoker
Contributor

teddykoker commented Jan 26, 2021

I think we can start with 1, and we can always add 2 later if it becomes heavily requested

@Borda
Member

Borda commented Jan 26, 2021

@akihironitta @teddykoker @ananthsub @rohitgr7 @s-rog
I have added a small poll ❤️ 🎉 🚀 on the description, so please check it there 🐰

@Borda
Member

Borda commented Jan 26, 2021

could it be a mix of the first two, with an extra argument specifying whether the model callbacks are added first or last? ideally you would allow some kind of insert position, but that is maybe overkill atm 🔢
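The mix Borda describes could look roughly like this; the `position` argument and function name are hypothetical, not part of any proposed Lightning API:

```python
def attach_model_callbacks(trainer_callbacks, model_callbacks, position="last"):
    # Hypothetical mix of Options 1 and 2: the model's callbacks are
    # appended as a block, but a flag controls whether they run before
    # or after the trainer-level callbacks.
    if position == "first":
        return list(model_callbacks) + list(trainer_callbacks)
    return list(trainer_callbacks) + list(model_callbacks)
```

A full insert-at-index API would be strictly more general, but as noted, probably overkill for the initial version.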

@SeanNaren
Contributor

I'd agree with option 1, with 2 being the future requirement for callbacks that interfere with each other!

@awaelchli awaelchli changed the title new LightningModule hook "configure_callbacks" new LightningModule hook "configure_callbacks" [skip ci] Jan 26, 2021
@awaelchli awaelchli changed the title new LightningModule hook "configure_callbacks" [skip ci] new LightningModule hook "configure_callbacks" Feb 6, 2021
@mergify mergify bot removed the has conflicts label Feb 6, 2021
Comment on lines +111 to +115
trainer_callbacks=[LearningRateMonitor(),
EarlyStopping(),
LearningRateMonitor(),
EarlyStopping()],
model_callbacks=[early_stopping, lr_monitor],
Contributor Author

@akihironitta @Borda why does yapf format it like this?
is it a bug?

Member

that looks strange... :/

Comment on lines +121 to +126
trainer_callbacks=[
LearningRateMonitor(), progress_bar,
EarlyStopping(),
LearningRateMonitor(),
EarlyStopping()
],
Contributor Author

same here, for some reason there are two elements on one line and the rest on separate lines. I thought it was supposed to produce consistent formatting

Member

yes, this looks strange...

@mergify mergify bot added the has conflicts label Feb 8, 2021
@mergify mergify bot removed the has conflicts label Feb 9, 2021
@awaelchli awaelchli added the ready (PRs ready to be merged) label Feb 9, 2021
Base automatically changed from release/1.2-dev to master February 11, 2021 14:32
@mergify mergify bot removed the has conflicts label Feb 11, 2021
Contributor

@tchaton tchaton left a comment

LGTM !

@awaelchli awaelchli enabled auto-merge (squash) February 12, 2021 23:28
@awaelchli awaelchli merged commit b8619a6 into master Feb 13, 2021
@awaelchli awaelchli deleted the feature/configure_callbacks branch February 13, 2021 00:27
Labels: callback · design (Includes a design discussion) · feature (Is an improvement or enhancement) · ready (PRs ready to be merged)
Projects: none yet
Successfully merging this pull request may close: configure_callbacks hook for LightningModule
10 participants