
Clearer disable validation logic #650

Merged · 5 commits from improve_no_val_logic into Lightning-AI:master · Jan 14, 2020

Conversation

@kuynzereb (Contributor):

In this PR I suggest making the disable-validation logic clearer. It will be a flag set in `pretrain_routine()`, so during training the trainer will know whether there will be any validation runs. Validation will now be disabled if:

  • `num_val_batches=0`. This means either that there is no `val_dataloader` or that the user has set `val_percent_check=0` (this part will start working once Fix percent_checks #649 is merged).
  • `validation_step` was not overridden.

Once this is merged we will be able to finish #542.
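
For illustration, here is a rough sketch of the rule described above. It follows the wording of the description rather than the merged code: the surrounding method name is taken from the description and the exact condition is an assumption, while num_val_batches and is_overriden already appear in the trainer code quoted in the diffs below.

# Sketch only: illustrates the description above; the merged implementation may differ.
def pretrain_routine(self, model):
    ...
    # validation is disabled when there are no val batches (no val_dataloader,
    # or val_percent_check=0) or when validation_step is not overridden
    self.disable_validation = (
        self.num_val_batches == 0
        or not self.is_overriden('validation_step')
    )
    ...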

@Borda (Member) left a comment:

nice update 👍

 # run tiny validation (if validation defined)
 # to make sure program won't crash during val
 ref_model.on_sanity_check_start()
 ref_model.on_train_start()
-if self.get_val_dataloaders() is not None and self.num_sanity_val_steps > 0:
+if not self.disable_validation and self.num_sanity_val_steps > 0:
Member:

Consider using `_`, i.e. `self._disable_validation`, but maybe it's not needed.

Contributor (author):

It looks like the `_` prefix is not commonly used across the codebase, so I believe it would be more consistent to leave it as it is.

Member:

I know; I left it to your consideration... You introduced this variable, so you should know whether it is an exposed one (without `_`) or an internal one (with `_`)...

@kuynzereb (Contributor, author), Dec 26, 2019:

Yeah, I understand. I just mean that I would prefer to stick to the current codebase style, where there seems to be no distinction between exposed and internal variables :)

@@ -184,6 +184,7 @@ def __init__(self):
         self.num_training_batches = None
         self.val_check_batch = None
         self.num_val_batches = None
+        self.disable_validation = None
Member:

earlier you have it as True/False

Contributor (author):

Yeah, but it is the definition inside `TrainerTrainLoopMixin`, where we have:

# this is just a summary on variables used in this abstract class,
#  the proper values/initialisation should be done in child class

Member:

Yeah, but be consistent in your addition; use bool or object/None everywhere...

Contributor (author):

Hmmm... I define this in only 2 places: in `TrainerTrainLoopMixin` and in `Trainer`. Inside `TrainerTrainLoopMixin` all fields are set to None, and inside `Trainer` we assign the actual value. Maybe I don't understand something?
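
To illustrate the pattern under discussion, a simplified sketch (class contents trimmed; not the actual class hierarchy or the exact attribute set):

# Simplified sketch, not the real classes: the mixin only summarizes the
# attributes it relies on, while the concrete Trainer assigns real values.
class TrainerTrainLoopMixin:
    def __init__(self):
        # this is just a summary on variables used in this abstract class,
        #  the proper values/initialisation should be done in child class
        self.num_val_batches = None
        self.disable_validation = None


class Trainer(TrainerTrainLoopMixin):
    def __init__(self):
        super().__init__()
        # the concrete class assigns an actual bool; per the PR description,
        # the final value is computed during the pretrain routine
        self.disable_validation = False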

@@ -266,76 +266,67 @@ def evaluate(self, model, dataloaders, max_batches, test=False):

     def run_evaluation(self, test=False):
         # when testing make sure user defined a test step
         can_run_test_step = False
         if not (self.is_overriden('test_step') and self.is_overriden('test_end')):
             m = '''You called .test() without defining a test step or test_end.
Member:

Please add ` around functions and variables.

Contributor (author):

Done
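
For reference, an illustrative sketch of how the check could read with backticks around identifiers, as requested; check_test_hooks_defined is a hypothetical helper name, and the exact wording and handling in the merged code may differ.

import warnings

# Illustrative only: identifiers wrapped in backticks per the review comment;
# this standalone helper is hypothetical, not the merged trainer method.
def check_test_hooks_defined(trainer):
    if not (trainer.is_overriden('test_step') and trainer.is_overriden('test_end')):
        m = ('You called `.test()` without defining a `test_step` or `test_end`. '
             'Please define them and try again.')
        warnings.warn(m)
        return False
    return True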

@williamFalcon merged commit 756c70a into Lightning-AI:master on Jan 14, 2020.
@williamFalcon (Contributor):
@kuynzereb great job! sorry for the delay :)

@kuynzereb deleted the improve_no_val_logic branch on January 14, 2020 at 07:48.