Add: WASM Emscripten example #347
Conversation
This comment was marked as off-topic.
I imagine an even more minimal example/test setup than YOLOv8 that is first compiled and then run in CI within a headless Chromium. We could check the developer console output to determine whether the model inferred correctly. What do you think? Which model would be suited? Any good examples of how to use headless Chromium in GitHub workflows? Would this become its own workflow, or rather a job in an existing workflow? PS: I guess we could use puppeteer to execute JavaScript code without any
YOLOv8 is already a pretty good example model - not too simple like the MNIST model, but not too large like GPT-2. I don't know off the top of my head whether either supports WASM threads, but we should give Node.js or ideally Deno a shot before jumping straight to headless Chromium for CI. Btw, great work here! Until now I didn't think it was even possible to get
Sure, I can use Deno to test the
Force-pushed from cbf5276 to 4337eca
I have observed that you have added mechanisms to provide a precompiled static libonnxruntime for the Emscripten environment. However, when I try to use the
Like I said in #349, I couldn't get binaries for 1.20.2, but they should be available for 1.21. Keep the current binary mechanism, and don't worry about CI; I'll handle those when 1.21 rolls around 👍
FYI,
Force-pushed from c63d5ad to 78b0f6f
I have pulled the latest changes from
I have removed the
Another cause for the issue could be the upgrade to Emscripten v4.0.3. Any clue?
Those symbols are from C++'s std, not Rust's. It's definitely caused by the Emscripten upgrade. Was the ONNX Runtime binary compiled with Emscripten 4.0.3?
It should have been, according to the log files. Update: Soon it must be Emscripten 4.0.4 :) |
Is there any progress? I tried to reproduce but failed. I am looking forward to the final merge. |
Pure WASI is a no, unfortunately.
@raphaelmenges Did you ever get to the bottom of that linking issue? And do you need my help with anything here? Looks like ONNX Runtime v1.21 is
I have not investigated the issue further yet. I can do so sometime next week, if that is early enough for you!
Force-pushed from 862a70b to 8a174a9
I have upgraded my code to include your latest changes and your precompiled
I then removed linking to
I have then checked the logs of your precompiled
In the release note of v1.21 it is also mentioned that
Dawn is indeed required for Emscripten too. Does the
Let's get WASM32 working before looking into WASM64. Browser support doesn't seem too good anyway (though better than WebGPU, to be fair 😶)
I tried
Anyway, I am fine if you decide not to include the WebGPU ep in the WASM example at the moment. I am also happy if you take over this pull request now and make any desired changes, i.e., remove the WebGPU ep code, etc.!
Hmm, those
In another context I just realized that the YoloV8 model you provide now has a different URL: https://cdn.pyke.io/0/pyke:ort-rs/[email protected]/yolov8m.onnx Can you fix that link in the WASM example code?
I don't think I can push to your branch because it's an organization-owned fork, but I updated the model URL and made another minor tweak to fix the name of the output .js file, and it's all working! Next is WebGPU =)
EDIT: yeah, no WebGPU for now. It links fine and acknowledges the EP but complains of a Rust panic during the call to
EDIT 2: seems like the WebGPU build is just broken in general, as removing the registration makes it panic at a different point. I still don't know where it would be panicking with an empty message. This type of weirdness only ever happens on WASM 🙃
This might be the pull request to track for the status of the WebGPU ep in WASM: microsoft/onnxruntime#23697
I am already using the WebGPU ep successfully on macOS, btw. :)
Opened #363 with my changes, thank you for all your hard work here!
Co-authored-by: r.menges <[email protected]>
According to @fs-eire, the WebGPU ep should work now in WASM: microsoft/onnxruntime#23072 (comment)
Co-authored-by: r.menges <[email protected]>
The WebGPU ep via WASM now replaces the JSEP (JavaScript Execution Provider) in onnxruntime for Web: microsoft/onnxruntime@2c041e0
I guess the WebGPU ep should work in WASM now? Sadly, I have no time at the moment to play around with that.
@raphaelmenges I am working on WebGPU support with
Sounds promising, cool! Do you still use Emscripten to build onnxruntime itself, then?
Thanks for all the work on this, guys. I'm trying to reproduce it and build the WASM example. I first needed to revert my branch to the commit that checked it in, otherwise I'd be hit with some TLS requirements, and then tried to build with the webgpu feature on Ubuntu 24 and hit the same issue:
I've also tried to change it to use the
@andrenatal Regarding the last error, you just need to add the
Thanks @decahedron1! You mean add the ndarray feature to the ort-tract dependency?
Add it to the

```diff
 [dependencies.ort]
 version = "=2.0.0-rc.10"
 default-features = false
 features = [
+  "ndarray",
   "alternative-backend"
 ]
```
Thank you @decahedron1. I'll try to reproduce and build the WASM example using both tract and onnxruntime (with and without webgpu). I'll use macOS arm64 as the builder platform, and if I hit roadblocks I'll switch to Ubuntu 24 x86.
So I first tried tract on macOS, and it built after I added the ndarray feature back, but I'm getting the following panic when testing on Chrome:
the source of which is here:
My changes were these: https://github.com/andrenatal/ort/pull/1/files
Let's move this to a new issue/discussion; I don't want to flood the other participants' inboxes =)

Hello 👋,
I noticed that support for the WASM (WASI?) target in this crate has been officially dropped. However, I recently encountered a use case that requires me to run `onnxruntime` on the Web. This led me to investigate whether `ort` could work within an Emscripten environment.

I discovered that this crate can be used as-is with the `wasm32-unknown-emscripten` Rust compilation target! While the setup can get a bit complex, especially when enabling multi-threading in `onnxruntime`, it works well. I would love to see this example merged into the official repository, but I also understand if it is considered too experimental to be officially endorsed.

Here's a `.gif` for motivation, showcasing my example using the YoloV8 model to classify objects in pictures on Chromium:

The Less Crazy Part
Microsoft does not provide precompiled static libraries for WASM, so I created a GitHub Actions workflow to handle this. The generated `libonnxruntime.a` can be linked with `ort` as usual, even when targeting `wasm32-unknown-emscripten`.

To expose Rust functions to JavaScript, the Rust main file must provide a C interface, which Emscripten can export. I use `rust-embed` to bundle the `.onnx` model into the `.wasm`. An `index.html` file then incorporates the `.js` and `.wasm` outputs, which include the compiled Rust code, `onnxruntime`, and the model.

Additionally, the Emscripten SDK version used by the Rust compiler must match the exact version used by `onnxruntime` for successful linking.

The Crazy Part
Things get trickier when enabling multi-threading in `onnxruntime`. Since v1.19.0, Microsoft recommends enabling multi-threading using the `--enable_wasm_threads` build flag. This links `libonnxruntime.a` to `pthread`, meaning all linked objects, including Rust's standard library, must also be compiled with `pthread` support.

However, Rust's standard library is not compiled this way by default, so you must switch to Rust nightly and compile the Rust standard library with `+atomics,+bulk-memory,+mutable-globals`.

Additionally, the server must set specific CORS flags, as multi-threaded Emscripten uses `SharedArrayBuffer` in the Web browser, which requires these settings.

Verdict
I am already opening the pull request as a draft to get early feedback. However, at least the following ToDos are pending before a proper pull request could happen:

- `ort`'s precompiled static `libonnxruntime` mechanism.

In the future, I would love to see execution providers for the Web become available in `ort`:

- Might be the best option for now?
- Seems to be exposed to the JavaScript world, only.