This repository has been archived by the owner on Oct 9, 2023. It is now read-only.
Replies: 1 comment
I believe a scheduler should be exposed to the tasks as an optional argument in https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/core/model.py. That would make this much cleaner.
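One way the suggestion above could look is a scheduler factory passed to the task constructor. This is only a sketch of the idea, not the actual flash API: the `scheduler` parameter name is an assumption, and `Task` is stubbed out here so the snippet runs standalone.

```python
class Task:
    """Stand-in sketch of flash.Task accepting an optional scheduler factory.

    Hypothetical API: the `scheduler` argument is an assumption for
    illustration, not a real lightning-flash parameter.
    """

    def __init__(self, model=None, optimizer=None, scheduler=None, learning_rate=0.1):
        self.model = model
        # Default optimizer factory stands in for e.g. torch.optim.Adam.
        self.optimizer_factory = optimizer or (
            lambda params, lr: {"params": params, "lr": lr}
        )
        # e.g. functools.partial(CosineAnnealingLR, T_max=200) in real code.
        self.scheduler_factory = scheduler
        self.learning_rate = learning_rate

    def parameters(self):
        return []

    def configure_optimizers(self):
        optimizer = self.optimizer_factory(self.parameters(), self.learning_rate)
        if self.scheduler_factory is None:
            # Behave exactly as before when no scheduler is requested.
            return optimizer
        # Lightning convention: lists of optimizers and schedulers.
        return [optimizer], [self.scheduler_factory(optimizer)]
```

With this shape, users who do not pass a scheduler see no change, while users who do get it wired into `configure_optimizers` automatically.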
Hi,
This project is sophisticated and exciting to me.
How do I use an LR scheduler when training a general task?
As far as I can see, flash.Task is not designed to let me configure the LR scheduler. There is also no way to set optimizer parameters other than the LR (such as momentum, weight_decay, etc.).
To work around this, I created a class that inherits from flash.Task as follows, and use it like this:
(The reason I wanted to do this was to reproduce: https://github.com/kuangliu/pytorch-cifar)
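The original snippets did not survive in this thread, but the workaround described is generally a subclass that overrides `configure_optimizers`. Below is a minimal sketch of that pattern, not the poster's actual code: `flash.Task`, `SGD`, and `CosineAnnealingLR` are stubbed so the example runs standalone, and the SGD hyperparameters (momentum 0.9, weight_decay 5e-4, T_max 200) are taken from the linked pytorch-cifar recipe.

```python
class SGD:
    """Stand-in for torch.optim.SGD (records its hyperparameters)."""

    def __init__(self, params, lr, momentum=0.0, weight_decay=0.0):
        self.params, self.lr = params, lr
        self.momentum, self.weight_decay = momentum, weight_decay


class CosineAnnealingLR:
    """Stand-in for torch.optim.lr_scheduler.CosineAnnealingLR."""

    def __init__(self, optimizer, T_max):
        self.optimizer, self.T_max = optimizer, T_max


class Task:
    """Stand-in for flash.Task (a LightningModule under the hood)."""

    def __init__(self, model=None, loss_fn=None, learning_rate=0.1):
        self.model, self.loss_fn = model, loss_fn
        self.learning_rate = learning_rate

    def parameters(self):
        return []


class SchedulableTask(Task):
    """Workaround sketch: override configure_optimizers to attach a
    scheduler and extra optimizer hyperparameters."""

    def configure_optimizers(self):
        optimizer = SGD(
            self.parameters(),
            lr=self.learning_rate,
            momentum=0.9,        # values from the pytorch-cifar recipe
            weight_decay=5e-4,
        )
        scheduler = CosineAnnealingLR(optimizer, T_max=200)
        # Lightning convention: return lists of optimizers and schedulers.
        return [optimizer], [scheduler]
```

Usage would be `task = SchedulableTask(model, loss_fn)` followed by passing it to a `flash.Trainer` as usual; Lightning calls `configure_optimizers` itself during `fit`.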
Honestly, I don't like this extension; it loses the simplicity of Flash.
Are there any good ideas? Thank you.