Conversation

@DNA386 (Contributor) commented Jul 29, 2025

This PR adds the option of including a learning rate scheduler in the training loop.
It requires adding an extra method that is called after each epoch completes; by default this method does nothing, so previous behaviour is maintained.
No schedule is set by default.
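
Roughly, the shape of the change is something like the sketch below; the hook name and constructor arguments here are illustrative placeholders rather than the actual PR code:

```python
from typing import Callable, Optional

import torch
from torch import nn


class Trainer:
    def __init__(self, model: nn.Module, lr: float = 0.1,
                 scheduler_fn: Optional[Callable] = None) -> None:
        self.optimizer = torch.optim.SGD(model.parameters(), lr=lr)
        # No scheduler by default, so existing behaviour is unchanged.
        self.scheduler = scheduler_fn(self.optimizer) if scheduler_fn else None

    def _on_epoch_end(self) -> None:
        """Called after every epoch; a no-op when no scheduler is set."""
        if self.scheduler is not None:
            self.scheduler.step()

    def fit(self, batches, epochs: int) -> None:
        for _ in range(epochs):
            for batch in batches:
                ...  # forward / backward / self.optimizer.step()
            self._on_epoch_end()


# Example: decay the learning rate by 10x every 5 epochs.
trainer = Trainer(nn.Linear(2, 1),
                  scheduler_fn=lambda opt: torch.optim.lr_scheduler.StepLR(
                      opt, step_size=5, gamma=0.1))
```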

@DNA386 DNA386 marked this pull request as ready for review July 29, 2025 13:28
@neiljdo neiljdo self-requested a review July 30, 2025 22:39
@neiljdo neiljdo added the enhancement New feature or request label Jul 30, 2025
@neiljdo (Collaborator) commented Aug 8, 2025

@DNA386 thank you for this PR. I believe this only works with epoch-based schedulers and not with other types (e.g. step-based like OneCycleLR, etc.)?
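
For reference, a step-based scheduler such as OneCycleLR is meant to be stepped once per optimizer step (i.e. per batch) rather than once per epoch, e.g.:

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import OneCycleLR

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
steps_per_epoch, epochs = 100, 10
scheduler = OneCycleLR(optimizer, max_lr=0.1,
                       steps_per_epoch=steps_per_epoch, epochs=epochs)

for epoch in range(epochs):
    for _ in range(steps_per_epoch):
        optimizer.step()   # normally preceded by forward/backward
        scheduler.step()   # stepped per batch, not per epoch
```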

@DNA386 (Contributor, Author) commented Aug 11, 2025

Ah, you're right, I didn't see that there was a different kind.
The cleanest way to unify this, and also allow chaining multiple schedulers together, might be to add a new lambeq scheduler class, with a subclass PytorchScheduler that can wrap (possibly multiple) PyTorch schedulers.
The scheduler class API would then get a position property (epoch vs. step) and a generic call function that takes the current epoch, loss, and step (and whether this is the last step in the epoch). That way the code could potentially be reverted to calling the scheduler after each step.
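
Roughly what I have in mind (class and attribute names are just placeholders at this point):

```python
from enum import Enum
from typing import Iterable, Optional


class SchedulerPosition(Enum):
    EPOCH = 'epoch'  # advance once per epoch
    STEP = 'step'    # advance once per optimizer step


class Scheduler:
    """Proposed lambeq-level scheduler base class."""
    position: SchedulerPosition = SchedulerPosition.EPOCH

    def __call__(self, epoch: int, step: int,
                 loss: Optional[float] = None,
                 last_step_in_epoch: bool = False) -> None:
        raise NotImplementedError


class PytorchScheduler(Scheduler):
    """Wraps one or more torch.optim.lr_scheduler schedulers."""

    def __init__(self, schedulers: Iterable,
                 position: SchedulerPosition = SchedulerPosition.STEP) -> None:
        self.schedulers = list(schedulers)
        self.position = position

    def __call__(self, epoch: int, step: int,
                 loss: Optional[float] = None,
                 last_step_in_epoch: bool = False) -> None:
        # The trainer decides *when* to call this based on `position`;
        # here we simply advance every wrapped scheduler.
        for sched in self.schedulers:
            sched.step()
```

The trainer would then only need to check `position` once and call the wrapper at the right granularity.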

I will implement and update the PR
