
Conversation

@satpalsr (Contributor)

The params check was removed for Bloom; it's not evident why.

@satpalsr satpalsr marked this pull request as draft March 17, 2023 18:06
@satpalsr satpalsr marked this pull request as ready for review March 17, 2023 18:07
@jeffra (Collaborator) left a comment

Good catch, this was accidentally added. Thank you @satpalsr

@jeffra jeffra removed the merge-queue label Apr 7, 2023
@jeffra (Collaborator) commented Apr 7, 2023

Removing this return seems to cause failures in some of our unit tests; we need to investigate further before merging.

@lekurile lekurile self-assigned this Aug 18, 2023
@lekurile (Contributor) commented
@satpalsr Looks like this check was removed in GH-2083, likely because certain models have mixed dtypes across their parameters. I think this warrants a deeper discussion about what purpose __check_params() serves in the runtime engine and whether we should remove it altogether.

In the meantime, I think we can close this PR, since simply removing the return in the function results in unit test errors, e.g. TestSaveTensorClone.test_save_tensor_clone[False-2]:

```
ValueError: torch.float32 is enabled but the following parameters have dtype that is not torch.float32: [('linears.0.weight', torch.float16), ('linears.0.bias', torch.float16), ('linears.1.weight', torch.float16), ('linears.1.bias', torch.float16), ('linears.2.weight', torch.float16), ('linears.2.bias', torch.float16), ('linears.3.weight', torch.float16), ('linears.3.bias', torch.float16)]
```
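For reference, here is a minimal sketch of the kind of dtype consistency check that produces an error like the one above. This is an illustrative approximation, not DeepSpeed's actual __check_params() implementation; the function name, signature, and exact message wording are assumptions.

```python
import torch

def check_params(model: torch.nn.Module, expected_dtype: torch.dtype) -> None:
    """Illustrative sketch (not DeepSpeed's __check_params): verify that all
    model parameters match the dtype the engine was configured with."""
    # Collect every parameter whose dtype differs from the enabled dtype.
    mismatched = [
        (name, param.dtype)
        for name, param in model.named_parameters()
        if param.dtype != expected_dtype
    ]
    if mismatched:
        # Raise with the offending (name, dtype) pairs, mirroring the
        # error format shown in the test failure above.
        raise ValueError(
            f"{expected_dtype} is enabled but the following parameters have "
            f"dtype that is not {expected_dtype}: {mismatched}"
        )
```

A check like this fails for models that intentionally mix dtypes (e.g. fp16 weights alongside fp32 parameters), which is why simply re-enabling it breaks the tests mentioned above.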

@lekurile lekurile closed this Sep 12, 2023
@lekurile lekurile reopened this Sep 12, 2023
@loadams loadams closed this Nov 16, 2023
