Various minor fixes for typos and types #105
Conversation
# TODO(ethan): add loss weightings here from a config
# e.g. weighted_losses = map(lambda k: some_weight_dict[k] * loss_dict[k], loss_dict.keys())
weighted_losses = loss_dict.values()
return functools.reduce(torch.add, weighted_losses)
This is mostly because the `loss_sum = 0.0` + for-loop pattern makes being strict about typing a bit tough, since I wanted to annotate the output as `torch.Tensor`, but `0.0` isn't one.
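The map/reduce pattern suggested in the TODO could look roughly like this. This is a minimal sketch with plain floats instead of tensors (so `operator.add` stands in for `torch.add`); the loss names and weight values are invented for illustration:

```python
import functools
import operator

# Hypothetical per-loss weights, as the TODO suggests loading from a config;
# the loss names and values here are made up for illustration.
some_weight_dict = {"rgb_loss": 1.0, "depth_loss": 0.1}
loss_dict = {"rgb_loss": 0.5, "depth_loss": 2.0}

# Weighted sum via the map/reduce pattern from the TODO comment.
weighted_losses = map(lambda k: some_weight_dict[k] * loss_dict[k], loss_dict.keys())
total = functools.reduce(operator.add, weighted_losses)
# total is approximately 0.7 (1.0 * 0.5 + 0.1 * 2.0)
```

Since `reduce` starts from the first element rather than a `0.0` accumulator, the result keeps the element type, which is what makes the `torch.Tensor` annotation possible.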
This is logic we intend to refactor - #90 (comment)
@@ -56,8 +56,7 @@ class TrainerConfig:
steps_per_save: int = MISSING
steps_per_test: int = MISSING
max_num_iterations: int = MISSING
# additional optional parameters here
resume_train: Optional[ResumeTrainConfig] = None
The code pretty regularly accesses `resume_train.*` without any checks for `resume_train is not None`, so if this were actually `None` here I think we'd just get a bunch of runtime errors?
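A quick sketch of the failure mode being described: unguarded attribute access on an `Optional` field raises at runtime whenever the field really is `None`. The class names mirror the diff, but the fields are simplified placeholders:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ResumeTrainConfig:
    # placeholder field; the real config has more options
    load_dir: str = "outputs"

@dataclass
class TrainerConfig:
    # simplified stand-in for the class in the diff
    resume_train: Optional[ResumeTrainConfig] = None

config = TrainerConfig()
failed = False
try:
    _ = config.resume_train.load_dir  # unguarded access, as the comment describes
except AttributeError:
    # 'NoneType' object has no attribute 'load_dir'
    failed = True
```

This is also exactly the kind of access a strict type checker would flag on an `Optional` field, which is why defaulting to `None` without guards is questionable.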
self.profiler_dict.keys(),
key=lambda k: self.profiler_dict[k]["val"],
reverse=True,
)
(this is the same, just seemed a little bit clearer)
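For reference, sorting dict keys by a nested value with a `key` function behaves like this (the profiler entries below are invented stand-ins for the real `profiler_dict`):

```python
# Hypothetical profiler entries keyed by function name.
profiler_dict = {
    "forward": {"val": 1.2},
    "backward": {"val": 3.4},
    "data_load": {"val": 0.5},
}

# Sort entries by their recorded value, largest first,
# mirroring the snippet from the diff.
funcs_sorted = sorted(
    profiler_dict.keys(),
    key=lambda k: profiler_dict[k]["val"],
    reverse=True,
)
# funcs_sorted == ["backward", "forward", "data_load"]
```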
LGTM
…_masking Move masking abstraction
* minor fixes for typos and types
* run isort
Just some minor things I noticed while making the fix in #104 🙂