Docs for new spacy-trf architectures #8954
Conversation
Should
@adrianeboyd: we discussed this on Slack and I thought the decision was not to move them to legacy, as the backoff only handles EDIT: or did you mean only the docs legacy page, not the actual legacy package?
I mean the legacy page in the docs, not
Yeah, hm. I thought about that, but decided not to because the title reads "Archived implementations available through spacy-legacy" :/
Hmm, I didn't realize that. Maybe then leave it documented on the transformer page? It seems bad that the v1 documentation is completely gone. |
There is a similar problem for the
The v1 docs aren't completely gone, though; there are two Accordion sections which state they are the same, minus the
Ah, I see. As long as you can actually find and see that using the site search (which honestly isn't working very well for me these days), then I think it's fine.
Commits:

* use TransformerModel.v2 in quickstart
* update docs for new transformer architectures
* bump spacy_transformers to 1.1.0
* Add new arguments spacy-transformers.TransformerModel.v3
* Mention that mixed-precision support is experimental
* Describe delta transformers.Tok2VecTransformer versions
* add dot
* add dot, again
* Update some more TransformerModel references v2 -> v3
* Add mixed-precision options to the training quickstart (disable mixed-precision training/prediction by default)
* Update setup.cfg
* Apply suggestions from code review
* Update website/docs/usage/embeddings-transformers.md

Co-authored-by: Adriane Boyd <[email protected]>
Co-authored-by: Daniël de Kok <[email protected]>
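For readers following along: the mixed-precision options mentioned in the commits above are set on the transformer component's model in the training config. A minimal sketch, assuming the `spacy-transformers.TransformerModel.v3` architecture; the model name and span-getter values here are illustrative, not prescriptive:

```ini
[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "roberta-base"
# Experimental; disabled by default in the quickstart.
mixed_precision = false

[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
window = 128
stride = 96
```

Upgrading an existing config from `TransformerModel.v2` to `.v3` should only require changing the `@architectures` string, since the new arguments have defaults.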
This PR should not be merged until `spacy-transformers` 1.1.0 is out. Keeping in draft until then.

Description

This PR updates the docs for the new `spacy-transformers` architectures and bumps `spacy-transformers` to 1.1.0. It also updates the quickstart to use the v2 model. I guess this can go wrong if people upgrade `spacy` but not `spacy-transformers` - they'd end up with an architecture that won't be available. Not sure how to avoid that, though; it also seems unfortunate to keep recommending an older architecture in the quickstart.

Types of change
docs update
Checklist