
Conversation

@patrickvonplaten
Contributor

@patrickvonplaten patrickvonplaten commented Nov 8, 2021

What does this PR do?

This PR is a first attempt to fix #13839. In short, T5 models whose input and output embeddings are not tied can now resize their embeddings.

Overall, this whole TF embedding-resizing logic is incredibly complex and hard to read... IMO, we should do a bigger refactor here.
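
For reference, a minimal sketch of the scenario this targets (the checkpoint name and added token are assumptions, not taken from the linked issue): mT5 sets tie_word_embeddings=False, so resizing the input embeddings must also resize the untied lm_head decoder.

```python
from transformers import T5Tokenizer, TFMT5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/mt5-small")
model = TFMT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# add a new token and resize; before this fix the untied lm_head decoder kept its old size
tokenizer.add_tokens(["<new_token>"])
model.resize_token_embeddings(len(tokenizer))
```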

TODO:

  • Add test

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

)

- if old_lm_head_decoder is not None and not is_input_output_equals:
+ if old_lm_head_decoder is not None and (not is_input_output_equals or not self.config.tie_word_embeddings):
Contributor Author

I don't think is_input_output_equals is enough to decide whether the input and output embeddings are tied or not -> it's mostly the tie_word_embeddings attribute that determines that
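
A toy illustration of that point (shapes and values are made up): two weight matrices can hold identical values without being the same variable, so a value comparison like is_input_output_equals can report "equal" even when the embeddings are not actually tied; config.tie_word_embeddings is the flag that records the tying.

```python
import tensorflow as tf

# two independently created variables that happen to hold the same values
input_embeddings = tf.Variable(tf.zeros((10, 4)))
lm_head_decoder = tf.Variable(tf.zeros((10, 4)))

values_match = bool(tf.reduce_all(input_embeddings == lm_head_decoder))  # True
actually_tied = input_embeddings is lm_head_decoder                      # False
print(values_match, actually_tied)
```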

if old_lm_head_decoder is not None and (not is_input_output_equals or not self.config.tie_word_embeddings):
    old_embedding_dim = shape_list(old_lm_head_decoder)[1]
    decoder_mask, current_decoder = init_copy_embeddings(old_lm_head_decoder, new_num_tokens)
    name = old_lm_head_decoder.name.split(":")[0] if not tf.executing_eagerly() else None
Contributor Author

We can't pass a name in eager mode, so it is disabled there.
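
A rough sketch of the pattern that line follows (shapes and the variable name are placeholders): reuse the old variable's name, minus the ":0" suffix, when building the resized weight in graph mode, and pass None in eager mode, where the name can't be forwarded as noted above.

```python
import tensorflow as tf

old_lm_head_decoder = tf.Variable(tf.zeros((32100, 512)), name="lm_head/decoder")

# reuse the old name (without the ":0" suffix) only when not executing eagerly
name = old_lm_head_decoder.name.split(":")[0] if not tf.executing_eagerly() else None
new_lm_head_decoder = tf.Variable(tf.zeros((32128, 512)), name=name)
```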

return embeds

# if embedding_layer is already a `tf.Tensor` simply output it
if isinstance(embedding_layer, tf.Tensor):
Contributor Author

embedding_layer can be a tensor for T5, I think -> so just return it in that case?
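
A hedged sketch of what that would look like (the helper's name and the attribute fallbacks are assumptions, not the exact library code): the weight-lookup helper returns the object untouched when it is already a plain tf.Tensor, as for T5's lm_head decoder, and otherwise keeps digging for a weight attribute.

```python
import tensorflow as tf

def get_word_embedding_weight(embedding_layer):
    # if embedding_layer is already a `tf.Tensor`, simply return it
    if isinstance(embedding_layer, tf.Tensor):
        return embedding_layer
    # otherwise look for the usual weight attributes on embedding layers
    for attr in ("weight", "decoder", "embeddings"):
        embeds = getattr(embedding_layer, attr, None)
        if embeds is not None:
            return embeds
    return None
```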

@patrickvonplaten patrickvonplaten changed the title fix t5 embeddings [TF]fix t5 embeddings Nov 8, 2021
@patrickvonplaten patrickvonplaten changed the title [TF]fix t5 embeddings [TF] Fix t5 embeddings Nov 8, 2021
Collaborator

@sgugger sgugger left a comment

Thanks for fixing! Is the method value or values? Both are used in the diff.

@patrickvonplaten patrickvonplaten changed the title [TF] Fix t5 embeddings [WIP][TF] Fix t5 embeddings Nov 8, 2021
@huggingface huggingface deleted a comment from github-actions bot Dec 10, 2021
@github-actions
Contributor

github-actions bot commented Jan 3, 2022

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@patrickvonplaten
Contributor Author

Superseded by #15567.

@patrickvonplaten patrickvonplaten deleted the fix_tf_mt5_resize_word_embeddings branch February 9, 2022 11:27


Development

Successfully merging this pull request may close these issues.

TF mT5 model is not adding new tokens into its vocabulary.
