[Feature Request] Add SmolLM model and WebLLM #850
Hey @lightaime, this model is supported by ollama, should we do a native integration? Refer: https://ollama.com/library/smollm
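Since ollama already serves SmolLM, the model can be reached today through ollama's OpenAI-compatible endpoint without any new integration code. A minimal sketch, assuming the ollama server is running locally on its default port and `ollama pull smollm` has been done (the `smollm` tag comes from the library page above):

```typescript
// Talk to SmolLM through ollama's OpenAI-compatible API.
// Assumes: `ollama pull smollm` has been run and the server listens on
// the default port 11434.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // ollama's OpenAI-compatible route
  apiKey: "ollama",                     // required by the client, ignored by ollama
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "smollm", // tag from https://ollama.com/library/smollm
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

This is only a sanity check that the served model responds; a native integration would wrap the same endpoint behind the project's existing model interface.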
Here are the key features of WebLLM:
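Two features are most relevant here: inference runs entirely in the browser with WebGPU acceleration, and the engine exposes an OpenAI-style chat completions API. A minimal browser-side sketch, assuming the `@mlc-ai/web-llm` npm package; the exact prebuilt model id for SmolLM is an assumption and should be checked against `prebuiltAppConfig.model_list`:

```typescript
// In-browser SmolLM via WebLLM (WebGPU). The model id below is a guess at
// one of the prebuilt SmolLM variants; verify it against
// prebuiltAppConfig.model_list exported by @mlc-ai/web-llm.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function demo() {
  const engine = await CreateMLCEngine("SmolLM-360M-Instruct-q4f16_1-MLC", {
    initProgressCallback: (report) => console.log(report.text), // weight download progress
  });

  // OpenAI-style chat completions, served locally by the browser.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "What is WebGPU?" }],
  });
  console.log(reply.choices[0].message.content);
}

demo().catch(console.error);
```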
Required prerequisites
Motivation
Add SmolLM (https://huggingface.co/blog/smollm, WebGPU demo: https://huggingface.co/spaces/HuggingFaceTB/SmolLM-360M-Instruct-WebGPU) and WebGPU support via WebLLM (https://webllm.mlc.ai/).
Solution
No response
Alternatives
No response
Additional context
No response