Various minor fixes for typos and types #105

Merged: 2 commits into master from brent/minor, Jul 8, 2022

Conversation

brentyi (Collaborator) commented Jul 8, 2022

Just some minor things I noticed while making the fix in #104 🙂

# TODO(ethan): add loss weightings here from a config
# e.g. weighted_losses = map(lambda k: some_weight_dict[k] * loss_dict[k], loss_dict.keys())
weighted_losses = loss_dict.values()
return functools.reduce(torch.add, weighted_losses)

brentyi (Collaborator, Author) commented:

This is mostly because the loss_sum = 0.0 + for-loop pattern makes strict typing a bit tough: I wanted to annotate the output as torch.Tensor, but 0.0 isn't one.
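
For reference, a minimal sketch of the two patterns being compared; the function names and the Dict[str, torch.Tensor] signature are illustrative, not the repository's actual code:

import functools
from typing import Dict

import torch

def sum_losses_loop(loss_dict: Dict[str, torch.Tensor]) -> torch.Tensor:
    loss_sum = 0.0  # the type checker infers float here...
    for loss in loss_dict.values():
        loss_sum = loss_sum + loss  # ...so strict checkers reject this reassignment
    return loss_sum  # annotated torch.Tensor, but inferred as float-or-Tensor

def sum_losses_reduce(loss_dict: Dict[str, torch.Tensor]) -> torch.Tensor:
    # Every intermediate value is a torch.Tensor; no float accumulator appears.
    return functools.reduce(torch.add, loss_dict.values())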

tancik (Contributor) replied:

This is logic we intend to refactor - #90 (comment)
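
As a hedged sketch, the weighted version the TODO describes might look something like this; some_weight_dict and the sample values are hypothetical, following the map example in the comment:

import functools

import torch

# Hypothetical losses and per-loss weights, for illustration only.
loss_dict = {"rgb": torch.tensor(0.5), "depth": torch.tensor(0.2)}
some_weight_dict = {"rgb": 1.0, "depth": 0.1}

weighted_losses = map(lambda k: some_weight_dict[k] * loss_dict[k], loss_dict.keys())
print(functools.reduce(torch.add, weighted_losses))  # tensor(0.5200)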

@@ -56,8 +56,7 @@ class TrainerConfig:
steps_per_save: int = MISSING
steps_per_test: int = MISSING
max_num_iterations: int = MISSING
# additional optional parameters here
resume_train: Optional[ResumeTrainConfig] = None

brentyi (Collaborator, Author) commented:

the code pretty regularly accesses resume_train.* without any checks for resume_train is not None, so if this were actually None here I think we'd just get a bunch of runtime errors?
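
A small sketch of the pitfall described above; the load_dir field is hypothetical, only ResumeTrainConfig and resume_train come from the diff:

from dataclasses import dataclass
from typing import Optional

@dataclass
class ResumeTrainConfig:
    load_dir: str = "checkpoints"  # hypothetical field for illustration

@dataclass
class TrainerConfig:
    resume_train: Optional[ResumeTrainConfig] = None

config = TrainerConfig()
# Unguarded access raises AttributeError at runtime when resume_train is None,
# and a strict type checker flags it:
#     config.resume_train.load_dir
# A guard keeps both the runtime and the checker happy:
if config.resume_train is not None:
    print(config.resume_train.load_dir)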

sorted(
    self.profiler_dict.keys(),
    key=lambda k: self.profiler_dict[k]["val"],
    reverse=True,
)

brentyi (Collaborator, Author) commented:

(this is the same, just seemed a little bit clearer)
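
For context, a self-contained version of that sort; the dict contents are invented, and only the {"val": ...} layout is taken from the lambda:

profiler_dict = {
    "train_iteration": {"val": 0.52},
    "render_image": {"val": 1.87},
    "load_batch": {"val": 0.10},
}

# Sort the profiled names by their recorded value, largest first.
sorted_keys = sorted(
    profiler_dict.keys(),
    key=lambda k: profiler_dict[k]["val"],
    reverse=True,
)
print(sorted_keys)  # ['render_image', 'train_iteration', 'load_batch']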

tancik (Contributor) commented Jul 8, 2022

LGTM

brentyi merged commit 14d030f into master on Jul 8, 2022
brentyi deleted the brent/minor branch on July 8, 2022 at 06:31
chris838 pushed a commit to chris838/nerfstudio that referenced this pull request on Apr 22, 2023
* minor fixes for typos and types

* run isort