
How to export an ONNX with opset version = 13? #459

Closed
joyyang1215 opened this issue Jun 12, 2024 · 8 comments
Labels: help wanted (Extra attention is needed)

@joyyang1215

How can I export an ONNX model with opset version = 13? Currently, silero_vad.onnx uses opset version 16.
Could you tell me how to get other opset versions of the ONNX model?
Thanks

@joyyang1215 joyyang1215 added the help wanted (Extra attention is needed) label on Jun 12, 2024
@snakers4
Owner

@joyyang1215

The new VAD version was released just now - #2 (comment).
It was also exported with opset 16.
Can you provide some reasoning behind your choice of opset?
If there is some logic behind the opset choice, maybe we can export alternative models with several opsets.

@snakers4
Owner

snakers4 commented Jul 1, 2024

While the new release is fresh, we can export with other opsets.
If this is necessary, please list the required opsets and the reasoning behind them.

@snakers4
Owner

If this becomes relevant, please open a new issue.

@letranhuy2612

I would like to downgrade to an opset version lower than 15 because, when deploying the model to Triton, the current Kubernetes setup only supports up to opset 15. This is the error I receive. Thanks

silero_vad | 1 | UNAVAILABLE: Internal: onnx runtime error 1: Load model from /tmp/folderH4ekFf/1/model.onnx failed: /workspace/onnxruntime/onnxruntime/core/graph/model_load_utils.h:47 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::__cxx11::basic_string<char>, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only *guarantees* support for models stamped with official released onnx opset versions. Opset 16 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx is till opset 15.

@snakers4
Owner

snakers4 commented Nov 1, 2024

15 for Kubernetes and Triton.
13 for - ?
Are there any other opsets necessary?

@letranhuy2612

I would like to downgrade to an opset version lower than 15, not necessarily 13. Thanks

@letranhuy2612

letranhuy2612 commented Nov 12, 2024

Could you tell me how to get other opset version of the ONNX model @snakers4 ? Thanks

@snakers4
Owner

#573

We added a 16 kHz-only ONNX model exported with opset 15.
It turns out the tool we were using to package all of the models was internally overriding the opset value and exporting with opset 16.
Opset 16 supports if-statements, unlike earlier opsets.
If this model does not work for you, please create another issue.
