
Fix #17893, removed dead code#17917

Merged
sgugger merged 3 commits into huggingface:main from clefourrier:patch-1 on Jun 29, 2022
Conversation

@clefourrier
Member

What does this PR do?

Fixes #17893

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@ydshieh

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Jun 28, 2022

The documentation is not available anymore as the PR was closed or merged.

Collaborator

@ydshieh ydshieh left a comment


LGTM, thank you, @clefourrier !

I think we can also remove the line _keys_to_ignore_on_load_missing = [r"position_ids"] (line 1395), but let's wait for @sgugger to confirm.

@LysandreJik
Member

LysandreJik commented Jun 28, 2022

Hey @clefourrier! The code quality error comes from a new release from black. Rebasing on main should solve the issue as you'll benefit from #17918.

@ydshieh
Collaborator

ydshieh commented Jun 28, 2022

Regarding the test

from Lysandre on Slack

There was a new release from black that has a slightly different behavior for the --preview flag that we use in the CI.

If you see failures in the CI for the code quality test, with a large number of file changes (>500), please mention to the author that they just need to rebase on/merge main in order to benefit from the fix.

@ydshieh ydshieh requested a review from patrickvonplaten June 28, 2022 09:56
@clefourrier
Member Author

@LysandreJik @ydshieh Should be good now! 😃
Thanks both, I had missed it on Slack.

Collaborator

@sgugger sgugger left a comment


I'll defer to @patrickvonplaten who knows the model best. Changes look good to me.

One last thing to change is the line

_keys_to_ignore_on_load_missing = [r"position_ids"]

later on in the pretrained model, which should now be

_keys_to_ignore_on_load_unexpected = [r"position_ids"]

Just in case there are some Longformer checkpoints in the wild with a position_ids key in their state dict.
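For context, a minimal sketch of the loading behavior this change relies on. This assumes the usual semantics of these attributes: keys present in a checkpoint but absent from the model are reported as "unexpected" unless they match an ignore pattern. The function and variable names below are illustrative only, not the actual transformers API.

```python
import re

def filter_unexpected_keys(checkpoint_keys, model_keys, ignore_patterns):
    """Return unexpected checkpoint keys that do NOT match an ignore pattern.

    Keys in the checkpoint state dict but not in the model would normally
    trigger a warning; patterns like r"position_ids" in
    `_keys_to_ignore_on_load_unexpected` suppress that warning so old
    checkpoints still containing the key load cleanly.
    """
    model_key_set = set(model_keys)
    unexpected = [k for k in checkpoint_keys if k not in model_key_set]
    return [
        k for k in unexpected
        if not any(re.search(p, k) for p in ignore_patterns)
    ]

# An old checkpoint that still serialized position_ids:
checkpoint = ["embeddings.position_ids", "embeddings.word_embeddings.weight"]
# The model after this PR no longer registers position_ids:
model = ["embeddings.word_embeddings.weight"]

# With the ignore pattern, the stale key is silently dropped.
print(filter_unexpected_keys(checkpoint, model, [r"position_ids"]))  # []
# Without it, the key would be reported as unexpected.
print(filter_unexpected_keys(checkpoint, model, []))
```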

@clefourrier
Member Author

@sgugger Done :)

Contributor

@patrickvonplaten patrickvonplaten left a comment


Thanks for the fix! Great catch :-)

@sgugger sgugger merged commit eb1493b into huggingface:main Jun 29, 2022
viclzhu pushed a commit to viclzhu/transformers that referenced this pull request Jul 18, 2022
* Removed dead position_id code, fix huggingface#17893

* Removed unused var

* Now ignores removed (dead) dict key for backward comp

Development

Successfully merging this pull request may close these issues.

Ambiguous positional embedding management in LongformerEmbeddings

6 participants