Clearer disable validation logic #650
Conversation
nice update 👍
  # run tiny validation (if validation defined)
  # to make sure program won't crash during val
  ref_model.on_sanity_check_start()
  ref_model.on_train_start()
- if self.get_val_dataloaders() is not None and self.num_sanity_val_steps > 0:
+ if not self.disable_validation and self.num_sanity_val_steps > 0:
consider using `_`, so `self._disable_validation`, but maybe it's not needed
It looks like the `_` prefix is not commonly used across the code, so I believe it would be more consistent to leave it as it is.
I know, that's why I left it to your consideration... you introduced this variable, so you should know whether it is exposed (without `_`) or internal (with `_`)...
Yeah, I understand. And I just mean that I would prefer to stick to the current codebase style, where there seems to be no distinction between exposed and internal variables :)
@@ -184,6 +184,7 @@ def __init__(self):
  self.num_training_batches = None
  self.val_check_batch = None
  self.num_val_batches = None
+ self.disable_validation = None
earlier you have it as True/False
Yeah, but it is the definition inside `TrainerTrainLoopMixin`, where we have:

# this is just a summary on variables used in this abstract class,
# the proper values/initialisation should be done in child class
yeah, but be consistent in your addition, use bool or object/None everywhere...
Hmmm... I define this in only 2 places: in `TrainerTrainLoopMixin` and in `Trainer`. Inside `TrainerTrainLoopMixin` all fields are set to `None`, and inside `Trainer` we assign the actual value. Maybe I don't understand something?
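To illustrate the pattern being discussed, here is a minimal sketch (not the actual Lightning code; the class and attribute names below are simplified stand-ins): the mixin declares the attribute as `None` purely as documentation, and the concrete child class assigns the real boolean value.

```python
class TrainerTrainLoopMixinSketch:
    # this is just a summary of variables used in this abstract class;
    # the proper values/initialisation should be done in the child class
    disable_validation = None


class TrainerSketch(TrainerTrainLoopMixinSketch):
    def __init__(self, num_val_batches, has_validation_step):
        # the actual True/False value is computed here, in the child class
        self.disable_validation = (
            num_val_batches == 0 or not has_validation_step
        )
```

Under this convention the `None` in the mixin never survives to runtime; it only documents which attributes the loop code expects the child class to provide.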
@@ -266,76 +266,67 @@ def evaluate(self, model, dataloaders, max_batches, test=False):

  def run_evaluation(self, test=False):
      # when testing make sure user defined a test step
      can_run_test_step = False
      if not (self.is_overriden('test_step') and self.is_overriden('test_end')):
          m = '''You called .test() without defining a test step or test_end.
pls add ` around functions and variables
Done
@kuynzereb great job! sorry for the delay :)
In this PR I suggest making the disable-validation logic clearer. It will be a flag set in `pretrain_routine()`, so during training the trainer will know whether there will be validation runs or not. Now validation will be disabled if:

- `num_val_batches=0`. This means either that there is no `val_dataloader` or that the user has set `val_percent_check=0` (this part will work when Fix percent_checks #649 is merged).
- `validation_step` was not overridden.

Once this is merged we will be able to finish #542.
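The two conditions above can be sketched as a single predicate. This is a hypothetical illustration, not the actual Lightning implementation; the function name and parameters are invented for this example, while `num_val_batches`, `val_percent_check`, and `validation_step` refer to the names used in the PR.

```python
def should_disable_validation(num_val_batches: int,
                              validation_step_overridden: bool) -> bool:
    """Decide whether validation runs should be skipped entirely.

    Validation is disabled when there are no validation batches
    (no val_dataloader, or val_percent_check=0), or when the user
    never overrode validation_step.
    """
    return num_val_batches == 0 or not validation_step_overridden
```

Computing this once in the pretrain routine means the train loop can simply check `self.disable_validation` instead of re-deriving the condition from dataloaders and overrides at every step.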