Names of parameters may benefit from not being abbreviated #119

Closed
alok opened this issue Aug 15, 2019 · 8 comments

alok (Contributor) commented Aug 15, 2019

I know this is nitpicky, but I think good naming is worth a lot of thought.

A lot of the API seems unhelpfully abbreviated to me, especially since Lightning is designed so that you don't have to handle manual details like dataloaders more than necessary.

Names like tng_dataloader don't buy anything over train_dataloader or training_dataloader, since names are written once but read many times. Really, tng could be replaced with training or train everywhere.

data_batch seems redundant; I think it could just be called batch, since in context it can only represent data anyway, and batch_nb is already a separate argument.

Describe the solution you'd like

Rename them. The library is still in its early days, so a breaking rename is cheap now.
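To make the proposal concrete, here is a hypothetical before/after sketch of a training_step signature (the argument names are the ones under discussion here, not taken from any pinned release):

```python
# Before: abbreviated names
def training_step(self, data_batch, batch_nb):
    x, y = data_batch
    ...

# After: "batch" is unambiguous in context, and batch_nb
# already distinguishes the index from the data itself.
def training_step(self, batch, batch_nb):
    x, y = batch
    ...
```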

alok added the feature and help wanted labels on Aug 15, 2019
williamFalcon (Contributor):

@alok sure, can you first list out the name changes here?

from -> to:

example:

tng_dataloader -> train_dataloader

alok (Contributor, Author) commented Aug 15, 2019

  • tng -> training
  • pct -> percent or frac
  • nb -> num (I don't have strong feelings about this one)
  • data_batch -> batch
  • prog -> progress
  • outputs in validation_end -> progress_metrics
  • dataloader -> loader (it's clearly a DataLoader, so the data part is redundant; no strong feeling here either)
  • current_epoch -> epoch (the only mixup would be with the total number of epochs, which could be called something like epochs)
  • gradient_clip -> gradient_clip_val (gradient_clip sounds like a boolean indicating whether or not to clip)
  • gpus in trainer -> gpu_ids
  • add_log_row_interval -> row_log_interval
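For illustration, here is roughly how a few of the renames in the list above would read at the call site (a sketch only; the import path and Trainer keyword arguments are assumed, not confirmed against the codebase):

```python
from pytorch_lightning import Trainer  # import path assumed

# Before: abbreviated/ambiguous keyword arguments
trainer = Trainer(gradient_clip=0.5, gpus=[0, 1], add_log_row_interval=10)

# After: gradient_clip_val makes clear it's a value, not an on/off flag;
# gpu_ids makes clear the list holds device indices.
trainer = Trainer(gradient_clip_val=0.5, gpu_ids=[0, 1], row_log_interval=10)
```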

williamFalcon (Contributor) commented Aug 15, 2019

Awesome suggestions. Let's do these:

keep validation_step
keep training_step
in trainer options use: train, test, val
for data: val_dataloader, test_dataloader, train_dataloader
keep pct
keep nb
data_batch -> batch
prog -> progress
keep progress_metrics
keep dataloader
keep current_epoch
gradient_clip -> gradient_clip_val
keep gpus
add_log_row_interval -> row_log_interval
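Putting the agreed list together, user code would look something like this (an illustrative sketch, assuming these remain keyword arguments on Trainer and hooks on LightningModule; the import path is assumed):

```python
from pytorch_lightning import Trainer, LightningModule  # import path assumed

class MyModel(LightningModule):
    # data_batch -> batch; batch_nb kept as-is
    def training_step(self, batch, batch_nb):
        ...

    # validation_step keeps its name
    def validation_step(self, batch, batch_nb):
        ...

trainer = Trainer(
    gradient_clip_val=0.5,  # renamed from gradient_clip
    row_log_interval=10,    # renamed from add_log_row_interval
    gpus=[0, 1],            # kept as-is
)
```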

Ir1d (Contributor) commented Aug 17, 2019

I propose renaming update_tng_log_metrics to update_train_log_metrics.

alok (Contributor, Author) commented Aug 20, 2019

I'll hold off on this until #146 is resolved, since it affects this.

IvanLazarevsky commented:
> Let's do these:
>
> gradient_clip -> gradient_clip_val

I suggest gradient_clip_norm instead, because PyTorch has torch.nn.utils.clip_grad_value_, which clips individual partial derivatives using torch.clamp, so gradient_clip_val could be mistaken for that element-wise behavior.
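For context, the two existing PyTorch utilities do very different things, which is why the name matters; a minimal demonstration:

```python
import torch

p = torch.nn.Parameter(torch.zeros(3))

# clip_grad_value_ clamps each gradient element into [-clip_value, clip_value]
p.grad = torch.tensor([10.0, -10.0, 0.5])
torch.nn.utils.clip_grad_value_([p], clip_value=1.0)
print(p.grad)         # tensor([ 1.0000, -1.0000,  0.5000])

# clip_grad_norm_ rescales the whole gradient so its L2 norm is at most max_norm
p.grad = torch.tensor([10.0, -10.0, 0.5])
torch.nn.utils.clip_grad_norm_([p], max_norm=1.0)
print(p.grad.norm())  # ~1.0
```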

williamFalcon (Contributor):

Merged #124
