Support for onnx and onnx-tf #216
Thanks for the request. This package contains native components, so it would have to be built into a wheel file. If you'd like to try doing this yourself, follow the instructions here. And if you're successful, please make a pull request so we can add the package to the public repository. If anyone else wants this package too, let us know by clicking the thumbs-up button above.
Right now I cannot find any onnxruntime packages.
If you have an existing onnx model you want to use with Chaquopy, you can try converting it on a desktop machine to a format we do support, such as TensorFlow Lite (see the sketch below).
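As a rough illustration of that workflow (not from the original thread): assuming a model file named `model.onnx`, and that the desktop machine has the `onnx`, `onnx-tf`, and `tensorflow` packages installed, a one-time conversion might look like this:

```python
# Desktop-side conversion sketch: ONNX -> TensorFlow SavedModel -> TFLite.
# "model.onnx", "model_tf", and "model.tflite" are hypothetical names.
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

# Load the ONNX model and export it as a TensorFlow SavedModel directory.
onnx_model = onnx.load("model.onnx")
tf_rep = prepare(onnx_model)      # wrap the ONNX graph in a TF backend
tf_rep.export_graph("model_tf")   # writes a SavedModel to "model_tf/"

# Convert the SavedModel to a .tflite file that can be bundled with the app.
converter = tf.lite.TFLiteConverter.from_saved_model("model_tf")
with open("model.tflite", "wb") as f:
    f.write(converter.convert())
```

The conversion happens entirely off-device; only the resulting `.tflite` file needs to ship with the Android app.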
Is support for onnx planned in the near future?
Not in the near future, sorry. But you can always try building it yourself as mentioned above.
I have been trying to build onnxruntime for days now, but I am getting nowhere. Their build script is apparently able to output wheel files, but I just can't get it to work. Is there somebody here who is able to do that? I am not at all familiar with C and C++ build scripts, compilers, etc.
Hello, is there any update on this?
Sorry, there's no update. But we do also support PyTorch (version 1.8.1) and TensorFlow Lite (version 2.5.0) – could you convert your model to one of those formats?
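To make the TensorFlow Lite route concrete, here is a minimal inference sketch (not from the thread) using TensorFlow's standard `tf.lite.Interpreter` API; the `model.tflite` path and the all-zeros input are hypothetical placeholders:

```python
# Minimal TFLite inference sketch; assumes the tensorflow package is available.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's declared shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```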
Is support for onnx planned in the near future?
Sorry, the status is still the same as in my previous comment.
Hi @mhsmith, thanks for your great work "Pythonizing" Android, which I really like, TBH. By the way, does PyTorch or tflite inference with Chaquopy utilize the GPU, or is it CPU-only? And is the model performance similar to native tflite in Flutter/Kotlin (ignoring post-processing)? Thanks in advance.
We've made no attempt to enable GPU support for these Python packages, so they're probably CPU-only. This means they may have worse performance than the official Android tflite packages for Java/Kotlin, but how much worse will depend on your application. I tried an official tflite image classification demo for Android a few years ago, and I think the GPU mode was about twice as fast as the CPU mode. But things could have changed a lot since then.
I need to convert models from ONNX to TensorFlow, but I cannot find the onnx packages. Please add them, thanks!