Hi @njeffrie, generally our recommendation is to port models as "custom code" models first. In these models, the code is included in the model repo itself, rather than in the core transformers library. You can write custom code models yourself without needing to open a PR to transformers, and users can still load them with AutoModel.from_pretrained(). Take a look at the docs here for more information.
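For anyone following along, the key to the "custom code" approach is that the model repo's `config.json` carries an `auto_map` entry pointing the Auto classes at the repo's own Python files. A minimal sketch of what that fragment might look like (the file and class names here are hypothetical placeholders, not the actual Moonshine repo layout):

```json
{
  "model_type": "moonshine",
  "auto_map": {
    "AutoConfig": "configuration_moonshine.MoonshineConfig",
    "AutoModel": "modeling_moonshine.MoonshineModel"
  }
}
```

With `configuration_moonshine.py` and `modeling_moonshine.py` in the repo alongside this config, users can then load the model with `AutoModel.from_pretrained(..., trust_remote_code=True)`.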
Excellent, thanks for the pointer! I'll definitely do this first, then we will consider creating a PR to add the model to transformers in the slightly longer term.
For now I'll keep this issue open for reference and for the future PR, unless you'd prefer that I close it and re-open it alongside the PR to transformers.
Model description
Model description can be found in the Moonshine Whitepaper.
I will be porting our existing PyTorch model to Transformers.
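As a rough illustration of what the port involves, a custom-code model pairs a `PretrainedConfig` subclass with a `PreTrainedModel` subclass. The sketch below is a minimal, hypothetical skeleton (the class names, `hidden_size` default, and the single-layer "encoder" are placeholders, not the actual Moonshine architecture):

```python
import torch
import torch.nn as nn
from transformers import PretrainedConfig, PreTrainedModel


class MoonshineConfig(PretrainedConfig):
    # model_type identifies this architecture to the Auto classes
    model_type = "moonshine"

    def __init__(self, hidden_size=288, **kwargs):
        # hidden_size is an illustrative hyperparameter, not the real value
        self.hidden_size = hidden_size
        super().__init__(**kwargs)


class MoonshineModel(PreTrainedModel):
    # config_class ties the model to its configuration for (de)serialization
    config_class = MoonshineConfig

    def __init__(self, config):
        super().__init__(config)
        # Placeholder for the real encoder/decoder stack being ported
        self.encoder = nn.Linear(config.hidden_size, config.hidden_size)

    def forward(self, inputs):
        return self.encoder(inputs)
```

Once classes like these live in the model repo and are referenced from `auto_map`, `save_pretrained`/`from_pretrained` work the same way as for models in the core library.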
Open source status
Provide useful links for the implementation
Implementation: special credit to @keveman for training and @evmaki for data collection and preprocessing.
Model weights