
inference output error #796

Open
xiaotongnii opened this issue Jan 6, 2025 · 2 comments
xiaotongnii commented Jan 6, 2025

version: v22.02
env: Android
platform: Armv8.2 CPU
model: ONNX converted to Arm NN (onnx2armnn)
input: the same data
diff: the Arm NN inference output has a large error, while the onnxruntime result is correct.

The Arm NN inference output has errors compared with onnxruntime running the same ONNX model.
How can I debug every layer's output to find where the error is introduced?

@Colm-in-Arm (Collaborator) commented:

Hello.

There is a pair of Optimizer options that print intermediate outputs to file:

OptimizerOptionsOpaque.SetDebugEnabled(bool) and OptimizerOptionsOpaque.SetDebugToFileEnabled(bool). How you set these will depend on how you are executing inferences; if you are using ExecuteNetwork, the -p parameter will do it.
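As an illustration, here is a minimal sketch (not from the thread) of how these options can be set when optimizing a parsed ONNX model with the Arm NN C++ API. The model file name, the CpuAcc backend choice, and the surrounding runtime setup are assumptions for the example; also note that OptimizerOptionsOpaque appears in recent Arm NN releases, so on v22.02 the equivalent options may live in the older OptimizerOptions struct.

```cpp
// Sketch: enable per-layer debug output when optimizing an Arm NN network.
// "model.onnx" and the CpuAcc backend are assumptions for illustration.
#include <armnn/ArmNN.hpp>
#include <armnnOnnxParser/IOnnxParser.hpp>

int main()
{
    using namespace armnn;

    // Parse the ONNX model into an Arm NN network.
    auto parser = armnnOnnxParser::IOnnxParser::Create();
    INetworkPtr network = parser->CreateNetworkFromBinaryFile("model.onnx");

    // Create the runtime.
    IRuntime::CreationOptions runtimeOptions;
    IRuntimePtr runtime = IRuntime::Create(runtimeOptions);

    // Enable the debug options so intermediate tensors are emitted
    // (SetDebugToFileEnabled writes them to files).
    OptimizerOptionsOpaque optimizerOptions;
    optimizerOptions.SetDebugEnabled(true);
    optimizerOptions.SetDebugToFileEnabled(true);

    // Optimize for the chosen backend with the debug options applied.
    IOptimizedNetworkPtr optNet = Optimize(*network,
                                           {Compute::CpuAcc},
                                           runtime->GetDeviceSpec(),
                                           optimizerOptions);

    NetworkId networkId;
    runtime->LoadNetwork(networkId, std::move(optNet));
    // ... create input/output tensors and call EnqueueWorkload as usual ...
    return 0;
}
```

If inferences are run through ExecuteNetwork instead, the same effect comes from the -p switch mentioned above, alongside the usual model and backend arguments.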

Please note that our Onnx parser hasn't had much attention in many years and there are many constructs that it does not support. The choice of backend may also affect the results; for example, using CpuRef with int8 data often leads to differences in inference results.

Colm.

@xiaotongnii (Author) commented:

Hi Colm,
Thanks for your reply. I will try to debug where the error is introduced using the OptimizerOptionsOpaque options you advised.
