ai.onnxruntime.OrtException: This tensor is not representable in Java, it's too big #7270
Comments
Well, that's a misleading error message, since the check is shape validation, not just a size limit. That should be fixed. However, the shape is reported as being …
I don't know why the output differs between PyTorch and ORT. Maybe someone with a better understanding of the internals could take a look. cc @pranavsharma
Issue resolved! The exception is raised when the output tensor won't contain any elements (a zero-sized dimension). In export_onnx.py (Python source) the output shape was fixed at export time, which is why the output differed from PyTorch's. To resolve it, export the model with a dynamic size.
Prediction via ONNX Runtime after exporting with a dynamic size in Python works. PS: torchvision.transforms.functional.to_tensor returns normalised values; I had been feeding non-normalised values. After fixing both, the same output came in Java 👍 Any conclusion, @Craigacp @pranavsharma?
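As a rough illustration (plain Python, not the actual Java validation code), the condition behind the exception is a shape containing a zero dimension, i.e. a tensor whose element count is zero, which happens whenever the detector finds nothing:

```python
import math

def element_count(shape):
    """Number of elements a tensor of this shape holds.

    A zero anywhere in the shape means the tensor holds no elements,
    which is the case the Java binding's shape check used to reject.
    """
    return math.prod(shape)

# A Faster R-CNN "boxes" output with no detections has shape [0, 4].
print(element_count([0, 4]))  # 0 -> tripped the "too big" error pre-fix

# With at least one detection the shape is [N, 4] and extraction works.
print(element_count([3, 4]))  # 12
```

The shape `[0, 4]` used here is an assumption based on the `fasterrcnn_resnet50_fpn` output convention, not a value quoted in the thread.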
Good, I'm glad you've figured it out. I'll update the error message at some point in the next couple of weeks.
Making the shape [1,4] might work.
You ended 12 hours of searching, thanks @anandvsr. Setting a dynamic input fixes the issue.
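For anyone landing here, a minimal sketch of the export-side fix. The axis names and the `model`/`dummy_input` variables below are assumptions for illustration, not the exact script from this thread; the `dynamic_axes` mapping is what gets passed to `torch.onnx.export` so the graph no longer bakes in a fixed output shape:

```python
# Sketch only: dynamic_axes marks which dimensions may vary at runtime,
# so an export traced with N detections still works when there are 0.
dynamic_axes = {
    "input":  {0: "batch", 2: "height", 3: "width"},
    "boxes":  {0: "num_detections"},
    "labels": {0: "num_detections"},
    "scores": {0: "num_detections"},
}

# With PyTorch installed this would be passed to the exporter, e.g.:
# torch.onnx.export(model, dummy_input, "fasterrcnn.onnx",
#                   input_names=["input"],
#                   output_names=["boxes", "labels", "scores"],
#                   dynamic_axes=dynamic_axes)

print(sorted(dynamic_axes))  # ['boxes', 'input', 'labels', 'scores']
```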
### Description
Allows the creation of zero-length tensors via the buffer path (the array path with zero-length arrays still throws, as the validation logic that checks it's not ragged would require more intrusive revision), and allows the `tensor.getValue()` method to return a Java multidimensional array with a zero dimension. Also added a test for the creation and extraction behaviour.
### Motivation and Context
The Python interface can return zero-length tensors (e.g. if object detection doesn't find any objects), and before this PR, calling `tensor.getValue()` in Java throws an exception with a confusing error message. Fixes #7270 & #15107.
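The post-fix `getValue()` behaviour can be mimicked in plain Python (a sketch, not the actual Java implementation): extracting a tensor of shape `[0, 4]` should yield an empty outer array rather than throwing.

```python
def to_nested(flat, shape):
    """Build a nested list of the given shape from a flat buffer,
    tolerating zero-length dimensions (mirroring the fixed getValue())."""
    if len(shape) == 1:
        return list(flat[:shape[0]])
    inner = 1
    for d in shape[1:]:
        inner *= d
    return [to_nested(flat[i * inner:(i + 1) * inner], shape[1:])
            for i in range(shape[0])]

print(to_nested([], [0, 4]))                  # []
print(to_nested([1, 2, 3, 4, 5, 6], [2, 3]))  # [[1, 2, 3], [4, 5, 6]]
```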
I get `This tensor is not representable in Java, it's too big` while running inference with a fasterrcnn_resnet50_fpn model exported to ONNX.

export_onnx.py (Python source):

build.gradle (Gradle source):

Pridiction.java (Java source):

Exception while inference:

Referred sites: