Fixing MuP #1061
base: main
Conversation
There are accessibility issues in these changes.
mup/README.md (outdated):

> This can be used to tune extremely large neural networks such as large pretrained transformers, as we have done in our work.
> More generally, μP reduces the fragility and uncertainty when transitioning from exploration to scaling up, which are not often talked about explicitly in the deep learning literature.
>
> ![](figures/sp_vs_mup_dashed.png)
This image is missing a text alternative. This is a problem for people using screen readers.
mup/README.md (outdated):

> μP turns out to be the *unique* "natural" parametrization that has this hyperparameter stability property across width, as empirically verified in the gif below on MLPs trained with SGD. Here, across time, we interpolate between PyTorch default and μP's learning rate and initialization scalings (right), and we scale up the width-256 model (log2(width)=8) to width 2^13 = 8192 using this interpolated scaling rule (left).
>
> ![](figures/parametrizations.gif)
This image is missing a text alternative. This is a problem for people using screen readers.
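For context, a minimal sketch (not this PR's code) of how a model is put into μP with the mup library, using the widths from the quoted text; the MLP layout and dimensions here are illustrative:

```python
import torch.nn as nn
from mup import MuReadout, set_base_shapes

def make_mlp(width):
    # MuReadout replaces the final nn.Linear so the output layer gets
    # mup's 1/width output scaling.
    return nn.Sequential(
        nn.Linear(3072, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        MuReadout(width, 10),
    )

model = make_mlp(8192)                      # the width-2^13 model from the text
base = make_mlp(256)                        # base width (log2(width) = 8)
delta = make_mlp(512)                       # used to infer which dims scale with width
set_base_shapes(model, base, delta=delta)   # attaches infshapes to params in place
```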
mup/examples/MLP/README.md (outdated):

> You should find the generated plots under `./coord_checks`, which show stable coordinate sizes under μP, e.g.,
>
> ![](coord_checks/μp_mlp_sgd_coord.png)
This image is missing a text alternative. This is a problem for people using screen readers.
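A rough sketch of how such coord-check plots can be produced with the mup library's coord_check utilities; `make_mlp` (from the sketch above) and `dataloader` are assumed to exist, and the keyword names follow the library's examples:

```python
from mup.coord_check import get_coord_data, plot_coord_data

# One lazily constructed model per width; get_coord_data trains each for a few
# steps and records the average coordinate size of every layer's output.
models = {w: (lambda w=w: make_mlp(w)) for w in (256, 512, 1024, 2048, 4096, 8192)}

df = get_coord_data(models, dataloader, mup=True, optimizer='sgd',
                    lr=0.1, nseeds=5, nsteps=3)
plot_coord_data(df, save_to='coord_checks/mup_mlp_sgd_coord.png')
```

Horizontal curves across widths indicate a correct μP setup; curves that grow with width indicate SP-like (incorrect) scaling.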
mup/examples/MLP/README.md (outdated):

> and growing sizes under SP, e.g.,
>
> ![](coord_checks/sp_mlp_sgd_coord.png)
This image is missing a text alternative. This is a problem for people using screen readers.
mup/README.md
Outdated
The first set of 3 plots shows an MLP in standard parametrization (SP), trained by adam. | ||
We see after 1 step of update, activation/output `l1` are exploding with width. | ||
This means SP is "incorrect." | ||
![](coord_checks/sp_mlp_adam_lr0.001_nseeds5_bn0_coord.png) |
This image is missing a text alternative. This is a problem for people using screen readers.
mup/README.md (outdated):

> We now do the same for an MLP in maximal update parametrization (μP) (including using `mup.optim.MuAdam` instead of `torch.optim.Adam`).
> In contrast to the above, all curves stay horizontal, indicating that μP is implemented correctly.
> ![](coord_checks/μp_mlp_adam_lr0.001_nseeds5_bn0_coord.png)
This image is missing a text alternative. This is a problem for people using screen readers.
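The optimizer swap mentioned in the quoted text is a one-line change; a minimal sketch, assuming the model already has base shapes set:

```python
from mup.optim import MuAdam

# SP baseline for comparison:
# from torch.optim import Adam
# optimizer = Adam(model.parameters(), lr=1e-3)

# MuAdam rescales the learning rate per parameter group according to each
# parameter's infshape (attached earlier by set_base_shapes).
optimizer = MuAdam(model.parameters(), lr=1e-3)
```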
mup/README.md (outdated):

> ### Wider is Always Better
>
> ![](figures/widerbetter.png)
This image is missing a text alternative. This is a problem for people using screen readers.
There are accessibility issues in these changes.
> 4. Set coord-check to false
>
> What you will get are some statistics of the pre-activations for models differing only in width. If done correctly, these should be approximately horizontal:
> ![](mup/figures/coord_check_up.0.jpg)
> <font size="1"> *Healthy coordinate check*</font>
This image is missing a text alternative. This is a problem for people using screen readers.
> ![](mup/figures/coord_check_sp.0.jpg)
> <font size="1"> *Something's wrong*</font>
This image is missing a text alternative. This is a problem for people using screen readers.
> A second kind of test is to pick any configuration and learning rate (that doesn't lead to diverging training) and simply run a few different experiments, fixing everything except the width. Since with μP wider is always better, the results should look like the figure below:
> ![](mup/figures/width_check.png)
This image is missing a text alternative. This is a problem for people using screen readers.
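A sketch of that second test, with hypothetical `build_model` and `train` helpers (the latter assumed to return a final validation metric where higher is better); only the width varies across runs:

```python
from mup import set_base_shapes

results = {}
for width in (256, 512, 1024, 2048):
    # Same base/delta shapes for every run; only the target width changes.
    model = set_base_shapes(build_model(width), build_model(256),
                            delta=build_model(512))
    results[width] = train(model, lr=3e-4, steps=1000)

# Under a correct muP setup, wider should never be worse:
widths = sorted(results)
assert all(results[a] <= results[b] for a, b in zip(widths, widths[1:])), results
```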
@nsarka -- FYI

Instead of incorporating muP into GPT-NeoX, we are going to move these changes to our fork of their repo and install that version until the changes are upstreamed.

Not all of his changes are muP-related. I've separated out the muP 1-line change into our fork until microsoft/mup#65 is merged. We can discuss the GPT-NeoX-specific changes here and remove the mup subdir.
Oh I see, I read the previous discussion backwards (thinking it was a 1-line NeoX edit and a substantial muP edit). I went ahead and removed the muP changes (moving them to the fork) and imported the new muP library. I haven't had a chance to check the correctness of this implementation yet, however.
That's fairly quick to verify; currently NeoX adjusts the learning rate with the width here, but
It looks like this is unchanged from your branch? I thought your branch was working. Or am I missing something?
It's fixed if we use the Eleuther fork of mup, where I added the width to the optimizer dict: https://github.com/EleutherAI/mup/blob/14e436bc013418725976e7cfb1b4e74e8901ab80/mup/optim.py#L75C9-L80C52.
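For reference, a rough sketch of what exposing the width multiplier on the optimizer dict amounts to; the names follow mup's infshape API, and this is not the fork's exact diff:

```python
# Record the width multiplier on each param group so code outside the
# optimizer (e.g. NeoX's lr scaling) can read target_width/base_width.
for group in optimizer.param_groups:
    p = group['params'][0]
    if hasattr(p, 'infshape'):              # attached by mup.set_base_shapes
        group['width_mult'] = p.infshape.width_mult()
```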
The current MuP implementation in NeoX is buggy. This PR restores the main functionality without major changes to the code. Current limitations:

The main issue in the current code is that the model is always initialized with `use_mup = False`, which is then set to its correct value later. This doesn't work, as it sets the wrong attribute at the init of all classes, meaning that effectively muP was never used. The best solution would be to loop through all modules and set the correct attribute there (see the sketch below). The current workaround provides a minimal modification whereby the attribute is reset at the re-init of the linear layers, meaning it does the correct thing for everything except the self-attention and embedding layers.

A second issue is that the code as-is expects the mu-optimizer to provide the correct multiplier of target_width/base_width, but this is not provided in the mup library. We should probably just open a PR on mup and get rid of this. As the fastest solution, mup is added to the repo, with the multiplier added to the optimizer dict. Additionally, the torch dependencies were removed from mup, since they are unnecessary and can only lead to issues.
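A minimal sketch of the module-loop fix described above; the attribute name comes from the PR text, and the exact NeoX module layout is assumed:

```python
def set_use_mup(model, use_mup: bool):
    # Reset the flag on every submodule after the config is known, instead of
    # relying on the value that happened to be passed at __init__ time.
    for module in model.modules():
        if hasattr(module, 'use_mup'):
            module.use_mup = use_mup
```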
Plots are also added for testing the implementation: