I have exported the ONNX model of YOLOv7 and tried to convert it on JetPack 4.5.1 with DeepStream 5.1, but I got an error as below:
It seems the Split op cannot be used on this platform?
The same ONNX model converts successfully on the following platform:
You can try providing the ONNX and engine file paths in the config. If the engine file doesn't exist, it will be created automatically from the ONNX file:
onnx-file=best.onnx
model-engine-file=best.onnx_b1_gpu0_fp16.engine
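As a sketch, those two keys belong in the [property] section of the nvinfer config file. The filenames are the ones from this thread; the other keys shown are common nvinfer settings included for context, not values verified on this setup:

```ini
[property]
onnx-file=best.onnx
; if this engine file does not exist, nvinfer builds it from the ONNX model
model-engine-file=best.onnx_b1_gpu0_fp16.engine
; network-mode: 0=FP32, 1=INT8, 2=FP16
network-mode=2
batch-size=1
```

Note that the engine build still goes through the TensorRT ONNX parser shipped with that JetPack release, so if the parser rejects an op, pointing the config at the ONNX file will hit the same error.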
Already set, but the same error still occurs. It seems the version of the ONNX parser plugin does not match the ONNX file.