Conversation

@stas00 stas00 commented Sep 15, 2020

Once @LysandreJik merges #6940, please merge these model cards.

These are models ported from https://github.com/jungokasai/deep-shallow/

The models are already on S3 under `allenai` - thank you for moving those from my username, @sshleifer.

I also added two more models from the same author.

Fixes #7049

@julien-c julien-c added the model card Related to pretrained model cards label Sep 15, 2020

codecov bot commented Sep 15, 2020

Codecov Report

Merging #7153 into master will decrease coverage by 1.35%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #7153      +/-   ##
==========================================
- Coverage   80.86%   79.50%   -1.36%     
==========================================
  Files         169      169              
  Lines       32293    32293              
==========================================
- Hits        26114    25675     -439     
- Misses       6179     6618     +439     
Impacted Files Coverage Δ
...c/transformers/modeling_tf_transfo_xl_utilities.py 10.00% <0.00%> (-76.00%) ⬇️
src/transformers/modeling_tf_xlnet.py 20.85% <0.00%> (-71.41%) ⬇️
src/transformers/modeling_tf_transfo_xl.py 19.85% <0.00%> (-68.29%) ⬇️
src/transformers/modeling_tf_gpt2.py 71.59% <0.00%> (-23.38%) ⬇️
src/transformers/modeling_lxmert.py 70.01% <0.00%> (-20.75%) ⬇️
src/transformers/modeling_mobilebert.py 79.21% <0.00%> (-10.25%) ⬇️
src/transformers/modeling_openai.py 72.25% <0.00%> (-10.00%) ⬇️
src/transformers/modeling_tf_utils.py 86.68% <0.00%> (-0.65%) ⬇️
src/transformers/generation_utils.py 96.92% <0.00%> (-0.28%) ⬇️
src/transformers/generation_tf_utils.py 86.46% <0.00%> (-0.26%) ⬇️
... and 10 more

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0203ad4...b4e0ad4. Read the comment docs.


stas00 commented Sep 16, 2020

I wasn't sure about the format for specifying multiple languages in the card header, so I added:

language: en, de

Is this correct?

I couldn't find an existing example of this situation. Perhaps it could be documented?

@julien-c

@stas00 stas00 mentioned this pull request Sep 16, 2020
@sshleifer


stas00 commented Sep 17, 2020

> I think you want like this

Do you mean:

language:
- en
- de

?
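For context, YAML front matter expresses multiple values as a block sequence, so the card header in question would presumably look like the following. This is a minimal sketch, assuming the hub reads the `language` field as a standard YAML list:

```yaml
---
language:
- en
- de
---
```

An inline form (`language: [en, de]`) is equivalent YAML, but the block-sequence style above matches what most existing cards use.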


@LysandreJik LysandreJik left a comment


Great, thanks!

@LysandreJik LysandreJik merged commit 0fe6e43 into huggingface:master Sep 17, 2020
@stas00 stas00 deleted the allenai-wmt16-cards branch September 17, 2020 16:04
sshleifer pushed a commit to sshleifer/transformers_fork that referenced this pull request Sep 17, 2020
…ggingface#7153)

* [model cards] ported allenai Deep Encoder, Shallow Decoder models

* typo

* fix references

* add allenai/wmt19-de-en-6-6 model cards

* fill-in the missing info for the build script as provided by the searcher.
Zigur pushed a commit to Zigur/transformers that referenced this pull request Oct 26, 2020
…ggingface#7153)


Labels

model card Related to pretrained model cards

Development

Successfully merging this pull request may close these issues.

Convert 12-1 and 6-1 en-de models from AllenNLP

4 participants