Breaking changes in model output layers format #587
Comments
TensorRT sometimes doesn't preserve the output order, which breaks the parser function. That's why it now uses 1 output instead of 3.
You can change the
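The reordering problem described above can also be guarded against inside a parser by looking output layers up by name instead of by index. Below is a minimal stdlib-only sketch of that idea; the dict-based layer descriptors and the names `boxes`, `scores`, and `classes` are hypothetical stand-ins for DeepStream's `outputLayersInfo` entries, not the repository's actual structures.

```python
# Sketch: index-based access breaks when TensorRT reorders outputs
# after the ONNX conversion; name-based lookup does not.
# Layer dicts here are hypothetical stand-ins for NvDsInferLayerInfo.

def find_layer(layers, name):
    """Return the layer whose 'name' matches, or None if absent."""
    for layer in layers:
        if layer["name"] == name:
            return layer
    return None

# Outputs may arrive in an arbitrary order after conversion.
layers = [
    {"name": "scores",  "data": [0.9, 0.8]},
    {"name": "classes", "data": [1, 3]},
    {"name": "boxes",   "data": [[0, 0, 10, 10], [5, 5, 20, 20]]},
]

boxes = find_layer(layers, "boxes")
scores = find_layer(layers, "scores")
classes = find_layer(layers, "classes")
```

The lookup is robust regardless of the order in which the engine reports its bindings, which is exactly what index-based access cannot guarantee.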
I think that's the same issue identified in this DeepStream forum topic when running your custom parser library.

If that's the only reason for the change, I kindly ask you to consider preserving the previous format and adapting the parser function instead.

Your great work in this repository is widely spread (example), and I believe that, besides me, there are several folks using ONNX models converted to the previous output format. IMHO the impact of this change is huge because it requires running the export scripts again with the original weights files.

Thank you
For Paddle (PP-YOLOE, RT-DETR, etc.) models, I can't easily set the output names for the ONNX model, so I can't get the layers by name.
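When output names can't be controlled at export time, as described above for the Paddle exporters, one possible fallback is to identify layers by shape and dtype rather than by name. This is a toy sketch of that idea under assumed conventions (boxes are `[N, 4]` float tensors, scores are 1-D float, classes are 1-D int); it is an illustration, not the repository's actual heuristic.

```python
# Sketch: disambiguate unnamed outputs by shape/dtype.
# The conventions below (boxes [N, 4] float, scores 1-D float,
# classes 1-D int) are assumptions for illustration only.

def classify_outputs(layers):
    """Map each layer descriptor to a role based on its shape/dtype."""
    roles = {}
    for layer in layers:
        shape = layer["shape"]
        if len(shape) == 2 and shape[1] == 4:
            roles["boxes"] = layer
        elif layer["dtype"] == "float32":
            roles["scores"] = layer
        else:
            roles["classes"] = layer
    return roles

# Outputs with uninformative names, in arbitrary order.
layers = [
    {"name": "out2", "shape": (100,),   "dtype": "int32"},
    {"name": "out0", "shape": (100, 4), "dtype": "float32"},
    {"name": "out1", "shape": (100,),   "dtype": "float32"},
]
roles = classify_outputs(layers)
```

A heuristic like this only works when the roles have distinguishable shapes; if two outputs share both shape and dtype, name-based or order-based disambiguation is still required.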
I think the previous

Also here it seems to be missing
@marcoslucianops I was able to work around it by using

Deployed it on DeepStream 7.0 with the previous version of
I think the best option is to keep one output layer for the models. TensorRT doesn't guarantee the output order when converting from ONNX. To keep the 3 outputs, it's possible to just use the old
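A parser that must serve models exported in either format could branch on the number of output layers it receives. The sketch below is a hedged, self-contained illustration: the single-output row layout `(x1, y1, x2, y2, score, class_id)` and the dict-based layer descriptors are assumptions for the example, not the repository's actual memory layout.

```python
# Sketch: accept both the new single-output format and the old
# three-output format in one parsing function. Row layout
# (x1, y1, x2, y2, score, class_id) is an assumed convention.

def parse_detections(layers):
    """Return a list of (box, score, class_id) tuples from either
    one concatenated [N, 6] output or three named outputs."""
    if len(layers) == 1:
        rows = layers[0]["data"]
        return [(row[:4], row[4], int(row[5])) for row in rows]
    by_name = {layer["name"]: layer["data"] for layer in layers}
    return list(zip(by_name["boxes"], by_name["scores"], by_name["classes"]))

# New single-output format: one concatenated [N, 6] tensor.
single = [{"name": "output", "data": [[0, 0, 10, 10, 0.9, 2]]}]

# Old three-output format: separate boxes, scores, classes.
triple = [
    {"name": "classes", "data": [2]},
    {"name": "boxes",   "data": [[0, 0, 10, 10]]},
    {"name": "scores",  "data": [0.9]},
]
```

Branching this way lets previously converted ONNX models keep working while new exports use the single-output layout.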
Can the above be applied to deepstream-app? I am really struggling.
Hi Marcos,

Can you please explain the motivation for changing the models' output format in this commit?

This breaks compatibility with all the models previously converted to ONNX (currently in production) that still use the old format, right?

Thank you