An error occurred during model execution: "RangeError: offset is out of bounds". #499
Comments
Hi there 👋 Indeed, this is a known issue, which originates from
@satyajandhyala is looking at it
Facing the same issue. Do we know when the latest version will be out?
Same issue here. Is there a previous version that is known to work with the TinyLlama and Phi demos?
+1!
This is still happening. 😞 Has the fix been released?
I just checked: I can run this model correctly with either the wasm or webgpu backend.
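For anyone experimenting with backends while waiting on the fix, here is a minimal sketch of how the wasm backend settings can be adjusted in @xenova/transformers through its `env` object. The particular values shown (single thread, no proxy worker) are illustrative assumptions, not a recommended configuration for this bug.

```js
import { env } from '@xenova/transformers';

// Illustrative wasm backend settings; defaults may differ between versions.
env.backends.onnx.wasm.numThreads = 1; // single-threaded execution
env.backends.onnx.wasm.proxy = false;  // keep inference on the main thread

// Any pipeline created after this point will use these wasm settings.
```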
Question
Hello, I'm having an issue getting this code to run in the browser. I'm using
Xenova/TinyLlama-1.1B-Chat-v1.0
on "@xenova/transformers": "^2.13.2".
It runs perfectly in Node.
In Node it runs:
But in the browser I see this:
Same issue in Firefox.
This issue seems to say it's a memory problem: #8
Is this model too large to run in the browser?
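Since the failing snippet itself isn't shown above, here is a hypothetical reconstruction of the kind of call that triggers the error, based only on the model id and package version in the report; the prompt and generation options are made-up placeholders. Loading the quantized weights (which I believe is the library's default) is also the usual way to keep a ~1.1B-parameter model within browser memory limits.

```js
import { pipeline } from '@xenova/transformers';

// Hypothetical reconstruction; the original failing snippet is not included in this issue text.
const generator = await pipeline(
  'text-generation',
  'Xenova/TinyLlama-1.1B-Chat-v1.0',
  { quantized: true }, // 8-bit weights to reduce memory use in the browser (assumed default)
);

// Placeholder prompt and options, for illustration only.
const output = await generator('What is the capital of France?', {
  max_new_tokens: 64,
});
console.log(output);
```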