
Docs for new spacy-trf architectures #8954

Merged 13 commits into explosion:master from fix/trf-v2 on Oct 18, 2021
Conversation

svlandeg
Member

This PR should not be merged until spacy-transformers 1.1.0 is out. Keeping in draft until then.

Description

This PR also updates the quickstart to use the v2 transformer model architecture. This can go wrong if people upgrade spaCy but not spacy-transformers: they'd end up with a config that references an architecture that isn't available. I'm not sure how to avoid that, though it also seems unfortunate to keep recommending an older architecture in the quickstart.
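For illustration, a rough sketch of the version-mismatch failure mode described above, assuming the quickstart config references the spacy-transformers.TransformerModel.v2 architecture. The registry lookup below is a hypothetical check, not something the quickstart itself does:

```python
from spacy import registry

# Sketch only: the quickstart config references the
# spacy-transformers.TransformerModel.v2 architecture, which is registered
# by spacy-transformers (via entry points), not by spaCy itself. If someone
# upgrades spaCy but keeps an old spacy-transformers, this lookup fails and
# the generated config won't load.
try:
    registry.architectures.get("spacy-transformers.TransformerModel.v2")
    print("Architecture found - the quickstart config should load.")
except Exception:
    print("Architecture not found - spacy-transformers needs to be upgraded too.")
```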

Types of change

docs update

Checklist

  • I have submitted the spaCy Contributor Agreement.
  • I ran the tests, and all new and existing tests passed.
  • My changes don't require a change to the documentation, or if they do, I've added all required information.

@svlandeg added the docs (Documentation and website) and feat / transformer (Feature: Transformer) labels on Aug 13, 2021
@adrianeboyd
Contributor

Should spacy-transformers.Tok2VecTransformer.v1 get added to the legacy page?

@svlandeg
Member Author

svlandeg commented Aug 17, 2021

@adrianeboyd: we discussed this on Slack and I thought the decision was not to move them to legacy, as the backoff only handles spacy.* architectures? Anyway, we probably do want to find a solution for this if we accumulate too many legacy architectures here.

EDIT: or did you mean only to the docs legacy page, not the actual legacy package?

@adrianeboyd
Contributor

I mean the legacy page in the docs, not spacy-legacy.

@svlandeg
Member Author

Yea, hm. I thought about that, but decided not to because the title reads "Archived implementations available through spacy-legacy" :/

@adrianeboyd
Contributor

Hmm, I didn't realize that. Maybe then leave it documented on the transformer page? It seems bad that the v1 documentation is completely gone.

@adrianeboyd
Contributor

There is a similar problem for the ModelOutput changes: how to document both versions of the TransformerData and FullTransformerBatch dataclasses.
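To make the two versions being discussed concrete, a minimal sketch assuming a loaded pipeline with a transformer component such as en_core_web_trf; the attribute names follow my reading of the spacy-transformers changes and should be checked against the final docs:

```python
import spacy

nlp = spacy.load("en_core_web_trf")  # any pipeline with a transformer component
doc = nlp("Mixed-precision support is experimental.")

trf_data = doc._.trf_data  # a TransformerData instance for this doc

# spacy-transformers 1.0.x: per-doc output stored as a list of arrays
# tensors = trf_data.tensors

# spacy-transformers 1.1.x: the raw Hugging Face ModelOutput is kept instead
# hidden = trf_data.model_output.last_hidden_state
```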

@svlandeg
Member Author

The v1 docs aren't completely gone though: there are two Accordion sections which state that they are the same, minus the transformer_config argument, so I guess I was hoping that was sufficient :p
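To make that delta concrete, a minimal sketch assuming spacy-transformers 1.1+ is installed: the newer architecture versions accept a transformer_config block that the v1 architectures lack, which is what those Accordion sections call out.

```python
import spacy

nlp = spacy.blank("en")

# Sketch only: transformer_config is forwarded to the underlying
# Hugging Face model and is the argument missing from the v1 architecture.
nlp.add_pipe(
    "transformer",
    config={
        "model": {
            "@architectures": "spacy-transformers.TransformerModel.v3",
            "name": "roberta-base",
            "transformer_config": {"output_attentions": True},
        }
    },
)
```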

@adrianeboyd
Contributor

Ah, I see. As long as you can actually find and see that using the site search (which honestly isn't working very well for me these days), then I think it's fine.

svlandeg and others added 2 commits October 4, 2021 14:20
@adrianeboyd marked this pull request as ready for review on October 18, 2021, 08:44
@adrianeboyd merged commit 3fd3531 into explosion:master on Oct 18, 2021
@svlandeg deleted the fix/trf-v2 branch on October 18, 2021, 12:15
adrianeboyd added a commit that referenced this pull request on Oct 18, 2021
* use TransformerModel.v2 in quickstart

* update docs for new transformer architectures

* bump spacy_transformers to 1.1.0

* Add new arguments spacy-transformers.TransformerModel.v3

* Mention that mixed-precision support is experimental

* Describe delta transformers.Tok2VecTransformer versions

* add dot

* add dot, again

* Update some more TransformerModel references v2 -> v3

* Add mixed-precision options to the training quickstart

Disable mixed-precision training/prediction by default.

* Update setup.cfg

Co-authored-by: Adriane Boyd <[email protected]>

* Apply suggestions from code review

Co-authored-by: Adriane Boyd <[email protected]>

* Update website/docs/usage/embeddings-transformers.md

Co-authored-by: Adriane Boyd <[email protected]>

Co-authored-by: Daniël de Kok <[email protected]>
Co-authored-by: Adriane Boyd <[email protected]>