Fix TFBert past_key_values
#16230
Conversation
The documentation is not available anymore as the PR was closed or merged.
```python
# The received `past_key_values` is a tuple of 2 elements.
# The 1st element is `encoder_hidden_states`. The 2nd element is a tuple of `n_layers` elements,
# each element is a tuple of 4 tensors of shape (batch_size, n_heads, seq_len - 1, embed_size_per_head)
if type(past_key_values) == tuple and len(past_key_values) == 2 and type(past_key_values[1]) == tuple:
```
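For context, a minimal sketch of what a workaround of this kind would do next (hypothetical; the helper name `split_received_past` is invented here for illustration and the actual diff may differ): split the received 2-tuple so that only the per-layer cache is kept as `past_key_values`.

```python
from typing import Any, Tuple


def split_received_past(past_key_values: Any) -> Tuple[Any, Any]:
    """Hypothetical helper (illustration only): if `past_key_values` is the
    2-element tuple described above, split it into `encoder_hidden_states`
    and the per-layer cache that the rest of TFBert expects."""
    if (
        type(past_key_values) == tuple
        and len(past_key_values) == 2
        and type(past_key_values[1]) == tuple
    ):
        encoder_hidden_states, past_key_values = past_key_values
        return encoder_hidden_states, past_key_values
    return None, past_key_values
```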
This shouldn't be the case. `past_key_values` should only be the 2nd element you describe above. @gante - did we miss this model in the past/encoder_outputs refactor?
TFBert and TFEncoderDecoder were both updated (PR). The test in question also doesn't have the `num_beams` nor the `sample` arguments in `generate`, meaning that it should be going through the updated `greedy_search`, which has the updated past format 🤔
@ydshieh, can I ask you to confirm that `generate()` calls the new `_generate()` internally (this one)? If not, it means that it is going through the old code, which may have some past-related issue (I made some ad hoc changes to comply with the new past format, but they may be incomplete).
I can check the logic flow, will report back.
- calls the new `_generate()` -> Yes
- then `return self.greedy_search` here
- this line `model_inputs = self.prepare_inputs_for_generation` gives `past_key_values`:
  - 1st step: None
  - 2nd step: a tuple of 2 elements (which shouldn't be)

It is probably the method `TFEncoderDecoderModel.prepare_inputs_for_generation` that should be corrected, rather than `TFBertModel`.
Considering it was me who worked on `TFEncoderDecoderModel`, I can fix it instead.
@gante, WDYT? If yes, do you have some hints/tips for me about the format changes you have made?
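For illustration, a rough sketch of how `TFEncoderDecoderModel.prepare_inputs_for_generation` could forward only the per-layer decoder cache as `past_key_values`. The method body and argument names below are assumptions made for the sketch; the actual fix landed in #16260 and may look different.

```python
def prepare_inputs_for_generation(
    self, input_ids, past=None, attention_mask=None, use_cache=None, encoder_outputs=None, **kwargs
):
    # Illustration only: delegate to the decoder so that `past_key_values`
    # ends up being just the per-layer cache, while `encoder_outputs` is
    # passed as a separate entry instead of being bundled into the same tuple.
    decoder_inputs = self.decoder.prepare_inputs_for_generation(input_ids, past=past)
    return {
        "attention_mask": attention_mask,
        "decoder_attention_mask": decoder_inputs.get("attention_mask"),
        "decoder_input_ids": decoder_inputs["input_ids"],
        "encoder_outputs": encoder_outputs,
        "past_key_values": decoder_inputs.get("past_key_values"),  # per-layer cache only
        "use_cache": use_cache,
    }
```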
So the problem should be related to this line, and it probably implies I missed some updates 🙈
I can look into it; it is almost surely related to a problem in my past PR.
Ok, thank you!
patrickvonplaten left a comment
We should not need this hack. @gante can you check whether the prepare_inputs method might need an update here?
Here is the code I used for this issue/PR:

```python
from transformers import EncoderDecoderModel, AutoTokenizer, TFEncoderDecoderModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFEncoderDecoderModel.from_pretrained("ydshieh/bert2bert-cnn_dailymail-fp16")

ARTICLE_STUDENTS = """(CNN)Sigma Alpha Epsilon is under fire for a video showing party-bound fraternity members singing a racist chant. SAE's national chapter suspended the students, but University of Oklahoma President David Boren took it a step further, saying the university's affiliation with the fraternity is permanently done. The news is shocking, but it's not the first time SAE has faced controversy. SAE was founded March 9, 1856, at the University of Alabama, five years before the American Civil War, according to the fraternity website. When the war began, the group had fewer than 400 members, of which "369 went to war for the Confederate States and seven for the Union Army," the website says. The fraternity now boasts more than 200,000 living alumni, along with about 15,000 undergraduates populating 219 chapters and 20 "colonies" seeking full membership at universities. SAE has had to work hard to change recently after a string of member deaths, many blamed on the hazing of new recruits, SAE national President Bradley Cohen wrote in a message on the fraternity's website. The fraternity's website lists more than 130 chapters cited or suspended for "health and safety incidents" since 2010. At least 30 of the incidents involved hazing, and dozens more involved alcohol. However, the list is missing numerous incidents from recent months. Among them, according to various media outlets: Yale University banned the SAEs from campus activities last month after members allegedly tried to interfere with a sexual misconduct investigation connected to an initiation rite. Stanford University in December suspended SAE housing privileges after finding sorority members attending a fraternity function were subjected to graphic sexual content. And Johns Hopkins University in November suspended the fraternity for underage drinking. "The media has labeled us as the 'nation's deadliest fraternity,' " Cohen said. In 2011, for example, a student died while being coerced into excessive alcohol consumption, according to a lawsuit. SAE's previous insurer dumped the fraternity. "As a result, we are paying Lloyd's of London the highest insurance rates in the Greek-letter world," Cohen said. Universities have turned down SAE's attempts to open new chapters, and the fraternity had to close 12 in 18 months over hazing incidents."""

input_dict = tokenizer(ARTICLE_STUDENTS, return_tensors="tf")
output_ids = model.generate(input_ids=input_dict["input_ids"], max_length=None).numpy().tolist()
summary = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
```
Agreed that we shouldn't need this hack. @ydshieh LMK if you want me to look into it.
Yes, please :-) - I can check the logic flow in this code snippet though!
Closing since the root cause is found and being addressed in #16260.
What does this PR do?
Currently, `TFBertEncoderDecoderModelTest.test_bert2bert_summarization` on `master` will fail because the `past_key_values` received by TF Bert is a tuple of 2 elements:

- `encoder_hidden_states`
- `past_key_values` as in the PyTorch version

The remaining code in TF Bert expects `past_key_values` not to contain the 1st element. Prior to #15944, `past_key_values` was only the 2nd element; this is no longer the case in the current version.

This PR fixes the test, but it looks like we should prepare the correct inputs rather than apply this hacky fix.
I would like to have your confirmation (in particular @gante's) before going further, thanks.
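For illustration, a small runnable sketch of the two formats described above (all dimensions and variable names below are hypothetical, chosen only to make the structures concrete):

```python
import tensorflow as tf

# Hypothetical dimensions, for illustration only.
batch_size, n_heads, past_len, head_dim, n_layers = 2, 12, 7, 64, 4

# Format TF Bert expects: one entry per layer, each a tuple of 4 tensors of
# shape (batch_size, n_heads, seq_len - 1, embed_size_per_head).
expected_past = tuple(
    tuple(tf.zeros((batch_size, n_heads, past_len, head_dim)) for _ in range(4))
    for _ in range(n_layers)
)

# Format TF Bert actually receives in the failing test: the expected cache
# bundled together with the encoder hidden states in a 2-element tuple.
encoder_hidden_states = tf.zeros((batch_size, past_len + 1, n_heads * head_dim))
received_past = (encoder_hidden_states, expected_past)

assert len(received_past) == 2 and received_past[1] is expected_past
```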