
Conversation

SunMarc (Member) commented Jun 5, 2023

What does this PR do

This is the transformers-library counterpart of the fix in this PR.

It handles the case where a user passes their own device map (via the from_pretrained method) but forgets that tied parameters must be placed on the same device. We now raise an error listing which parameters should be on the same device.
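For illustration, a minimal sketch of the failure mode this check targets (the checkpoint name and module names are assumptions, not taken from the PR): in gpt2, lm_head.weight is tied to the input embedding weight, so splitting them across devices in a manual device map should now raise an error.

```python
from transformers import AutoModelForCausalLM

# Hypothetical device map that accidentally splits tied weights:
# gpt2 ties lm_head.weight to transformer.wte.weight, but this map
# puts the transformer block on GPU 0 and lm_head on the CPU.
device_map = {"transformer": 0, "lm_head": "cpu"}

# With this PR, from_pretrained raises an error naming the tied
# parameters that must live on the same device, instead of silently
# loading a model with inconsistent placements.
model = AutoModelForCausalLM.from_pretrained("gpt2", device_map=device_map)
```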

SunMarc requested a review from sgugger on June 5, 2023, 19:35
SunMarc (Member, Author) commented Jun 5, 2023

The tests are failing because I used a function that I recently added in accelerate.utils. Should we use the main branch of Accelerate for the tests, @sgugger?

sgugger (Collaborator) left a comment

Thanks for your PR!
You will need to protect the import a bit better, as there are many CI checks (and users!) that do not run the latest version of Accelerate, and we don't want to break everything for them ;-)
You can put the import in a try/except and set the function to None when it's not available in the installed version of Accelerate.
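A minimal sketch of the suggested guard (the helper name check_tied_parameters_on_same_device and the wrapper maybe_check_tied_parameters are assumptions based on the branch name, not confirmed identifiers from this PR):

```python
# Guard the import so older Accelerate versions keep working.
try:
    from accelerate.utils import check_tied_parameters_on_same_device
except ImportError:
    # Older Accelerate releases do not ship this helper yet.
    check_tied_parameters_on_same_device = None


def maybe_check_tied_parameters(tied_params, device_map):
    # Only run the check when the installed Accelerate version provides it;
    # otherwise, skip it silently rather than crashing at import time.
    if check_tied_parameters_on_same_device is not None:
        check_tied_parameters_on_same_device(tied_params, device_map)
```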

HuggingFaceDocBuilderDev commented Jun 5, 2023

The documentation is not available anymore as the PR was closed or merged.

sgugger (Collaborator) left a comment

LGTM, thanks for iterating!

SunMarc merged commit 6307312 into huggingface:main on Jun 6, 2023
SunMarc deleted the check_tied_parameters branch on June 6, 2023, 13:12
novice03 pushed a commit to novice03/transformers that referenced this pull request Jun 23, 2023
* Add check for tied parameters

* Fix style

* fix style

* Fix versioning

* Change if to elif