
Add support for gradient clipping #1331

Merged: 2 commits into nerfstudio-project:main from ht/gradient-clip on Feb 1, 2023

Conversation

@hturki (Contributor) commented on Feb 1, 2023

I've found that this makes training more stable in certain cases, especially when using high learning rates.
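For context, gradient clipping by global norm (the behaviour of PyTorch's `torch.nn.utils.clip_grad_norm_`, which is the usual way to wire this into a torch training loop) can be sketched in plain Python; the function name and list-of-lists gradient layout here are illustrative, not the PR's actual API:

```python
import math

def clip_grad_norm(grads, max_norm, eps=1e-6):
    """Scale gradients in place so their global L2 norm is at most max_norm.

    Illustrative sketch: mirrors the idea behind
    torch.nn.utils.clip_grad_norm_. `grads` is a list of per-parameter
    gradient lists; returns the pre-clipping total norm.
    """
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    if total_norm > max_norm:
        # Rescale every gradient by the same factor so the update
        # direction is preserved and only its magnitude shrinks.
        scale = max_norm / (total_norm + eps)
        for grad in grads:
            for i in range(len(grad)):
                grad[i] *= scale
    return total_norm

# A single exploding gradient of norm 5 gets scaled down to norm 1,
# while a small gradient would pass through untouched.
grads = [[3.0, 4.0]]
pre_norm = clip_grad_norm(grads, max_norm=1.0)
```

Clipping by a hard per-element value (`clip_grad_value_` in torch) is the other common variant; clipping by norm is usually preferred because it keeps the update direction unchanged.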

@tancik (Contributor) left a comment

LGTM, are you also planning to update the nerfacto config?

@hturki (Contributor, Author) commented on Feb 1, 2023

I wasn't planning on it. So far I've been testing fairly specific use cases with a hash-based (but not nerfacto) architecture, and I'm not sure whether this is needed for nerfacto or whether the specific clipping values I'm using will generalize. For nerfacto I'd maybe start by adding weight decay per #873, and then we could figure out reasonable clipping values if training is still unstable.
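For readers unfamiliar with the alternative being suggested: weight decay regularizes by shrinking the parameters themselves rather than capping gradients. A minimal sketch of L2 weight decay folded into a plain SGD step, with illustrative names (in torch this corresponds to passing `weight_decay=...` to an optimizer such as `torch.optim.Adam` or `torch.optim.AdamW`):

```python
def sgd_step_with_weight_decay(params, grads, lr=1e-2, weight_decay=1e-4):
    """One plain-SGD update with L2 weight decay added to the gradient.

    Illustrative sketch only: the effective gradient becomes
    g + weight_decay * p, which pulls every parameter toward zero
    a little on each step.
    """
    for p, g in zip(params, grads):
        for i in range(len(p)):
            p[i] -= lr * (g[i] + weight_decay * p[i])
    return params

# With a zero loss gradient, the parameter still decays toward zero:
# p <- p - lr * weight_decay * p = 1.0 - 0.1 * 0.5 * 1.0 = 0.95
params = sgd_step_with_weight_decay([[1.0]], [[0.0]], lr=0.1, weight_decay=0.5)
```

Unlike clipping, which only intervenes on unusually large updates, this penalty acts on every step, which is why the two are complementary rather than interchangeable.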

@tancik (Contributor) commented on Feb 1, 2023

Gotcha 👍

@tancik tancik merged commit 48ec36e into nerfstudio-project:main Feb 1, 2023
@hturki hturki deleted the ht/gradient-clip branch February 1, 2023 23:53
chris838 pushed a commit to chris838/nerfstudio that referenced this pull request Apr 22, 2023