Names of parameters may benefit from not being abbreviated #119
I know this is nitpicky, but I think good naming is worth a lot of thought. A lot of the API seems unhelpfully abbreviated to me, especially since Lightning is designed so that you don't have to handle manual details like dataloaders more than necessary.

Names like `tng_dataloader` don't seem to buy anything over `train_dataloader` or `training_dataloader`, since they're written only once and read many more times. Really, `tng` could be replaced with `training` or `train` elsewhere too. `data_batch` seems redundant; I think it could just be called `batch`, since in context it can only represent data anyway, and `batch_nb` is already a separate argument.

Describe the solution you'd like

Rename. The library is still in early days.
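A rename like this doesn't have to break existing code immediately. Below is a minimal sketch of the idea, assuming a plain class standing in for `LightningModule` and a hypothetical deprecation alias; it is not Lightning's actual implementation:

```python
import warnings

import torch
from torch.utils.data import DataLoader, TensorDataset


class MyModel:  # stand-in for the LightningModule base class
    def train_dataloader(self):
        # New, unabbreviated name: written once, read many times.
        data = TensorDataset(torch.randn(64, 4), torch.randint(0, 2, (64,)))
        return DataLoader(data, batch_size=8)

    def tng_dataloader(self):
        # Old abbreviated name kept as a deprecated alias for a transition period.
        warnings.warn(
            "tng_dataloader is deprecated, use train_dataloader instead",
            DeprecationWarning,
        )
        return self.train_dataloader()

    def training_step(self, batch, batch_nb):
        # `batch` instead of `data_batch`: in context it can only be data,
        # and `batch_nb` is already a separate argument.
        x, y = batch
        return x.float().mean()


model = MyModel()
loader = model.tng_dataloader()  # still works, but emits a DeprecationWarning
x, y = next(iter(loader))
assert x.shape == (8, 4)
```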
Comments

@alok sure, can you first list out the name changes here, from -> to? Example:

Awesome suggestions, let's do these:

I propose to rename

I'll hold off on this until #146 is resolved, since it affects this.

I suggest `gradient_clip_norm`, because PyTorch has `torch.nn.utils.clip_grad_value_`, which clips individual partial derivatives using `torch.clamp`, so the unqualified name would be confusing.
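For reference, a small self-contained sketch of the two PyTorch clipping primitives this distinction is about (the tensor values are illustrative):

```python
import torch
from torch.nn.utils import clip_grad_norm_, clip_grad_value_

# A parameter with a known gradient, so the two behaviours are easy to compare.
p = torch.zeros(3, requires_grad=True)
p.grad = torch.tensor([3.0, 4.0, 0.0])  # L2 norm = 5.0

# Norm-based clipping rescales the whole gradient vector so its norm is at
# most max_norm, preserving its direction -- what `gradient_clip_norm` names.
clip_grad_norm_([p], max_norm=1.0)
print(p.grad)  # ~tensor([0.6000, 0.8000, 0.0000]), norm ~1.0

# Value-based clipping clamps each partial derivative independently
# (torch.clamp under the hood), which can change the gradient's direction.
p.grad = torch.tensor([3.0, 4.0, 0.0])
clip_grad_value_([p], clip_value=1.0)
print(p.grad)  # tensor([1., 1., 0.])
```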
Merged #124