
ai.onnxruntime.OrtException: This tensor is not representable in Java, it's too big #7270

Closed
anandvsr opened this issue Apr 7, 2021 · 7 comments · Fixed by #15116
Labels
api:Java issues related to the Java API

Comments


anandvsr commented Apr 7, 2021

I get "This tensor is not representable in Java, it's too big" while running inference with the fasterrcnn_resnet50_fpn model in ONNX.

export_onnx.py (Python Source):

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    x = [torch.rand(3, 1970, 1080)]
    torch.onnx.export(model, x, "faster_rcnn.onnx", opset_version = 11)

build.gradle (Gradle Source):

    dependencies {
          implementation group: 'com.microsoft.onnxruntime', name: 'onnxruntime', version: '1.7.0'
    }

Pridiction.java (Java Source):

47| try(OrtSession session = env.createSession(modelPath.toFile().getAbsolutePath(), 
48|                    new OrtSession.SessionOptions())) {
49|
50|     OnnxTensor t1 = OnnxTensor.createTensor(env, sourceData, dimensions);
51|     OrtSession.Result result = session.run(Collections.singletonMap("image.1", t1));
52|
53|     for (int i = 0; i < result.size(); i++) {
54|         System.out.println(result.get(i).getValue());
55|     }
56|
57|     result.close();
58|     t1.close();
59| }

Exception while inference:

ai.onnxruntime.OrtException: This tensor is not representable in Java, it's too big - shape = [0, 4]
	at ai.onnxruntime.TensorInfo.makeCarrier(TensorInfo.java:171)
	at ai.onnxruntime.OnnxTensor.getValue(OnnxTensor.java:99)
	at onnx.predict.Pridiction.main(Pridiction.java:54)

Referenced sites:

       1. https://www.onnxruntime.ai/docs/reference/api/java-api.html
       2. https://pytorch.org/vision/stable/models.html#faster-r-cnn
@hariharans29 added the api:Java (issues related to the Java API) and type:support labels Apr 7, 2021

Craigacp commented Apr 7, 2021

Well, that's a misleading error message, as the check is shape validation, not just that the tensor is too big. That should be fixed.

However, the shape is reported as [0, 4]. I don't think that's valid because the tensor won't contain any elements, so some kind of exception is expected (or I suppose it could return an empty array). Does this image contain anything that's been detected? If so, what does the Python ONNX Runtime API return for that image?
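A [0, 4] shape has zero elements because a tensor's element count is the product of its dimensions. A minimal sketch (a hypothetical helper, not the actual ONNX Runtime source) of why any zero dimension yields an empty tensor:

```java
// Hypothetical helper illustrating the element-count arithmetic behind the
// [0, 4] shape: the product of the dimensions is the number of elements,
// so a single zero dimension makes the whole tensor empty.
public class ShapeCheck {
    static long elementCount(long[] shape) {
        long count = 1;
        for (long dim : shape) {
            count *= dim;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(elementCount(new long[]{0, 4})); // prints 0: no detections
        System.out.println(elementCount(new long[]{3, 4})); // prints 12: three boxes
    }
}
```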


anandvsr commented Apr 8, 2021

Prediction via ONNX Runtime in Python:

[Screenshot: ONNX Runtime prediction output in Python, 2021-04-08]

Why does the inference return an empty array?
Please help clarify!

Prediction in PyTorch (Python source):
[Screenshot: PyTorch prediction output]


Craigacp commented Apr 8, 2021

I don't know why the output differs between PyTorch & ORT. Maybe someone with a better understanding of the internals could take a look. cc @pranavsharma

@anandvsr
Author

Issue Resolved!

The Exception

    ai.onnxruntime.OrtException: This tensor is not representable in Java, it's too big - shape = [0, 4]
	    at ai.onnxruntime.TensorInfo.makeCarrier(TensorInfo.java:171)
	    at ai.onnxruntime.OnnxTensor.getValue(OnnxTensor.java:99)
	    at onnx.predict.Pridiction.main(Pridiction.java:54)

is raised when the output tensor doesn't contain any elements!
Thank you @Craigacp

export_onnx.py (Python Source):

The outputs differ because the input shape was fixed at export time!

To resolve that, export the model with dynamic input sizes:

import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
w, h = 1080, 1970
x = [torch.rand(3, w, h)]
onnx_file = "resources/model/fasterrcnn_resnet50_fpn.onnx"
torch.onnx.export(model, x, onnx_file,
                  input_names=['input'],
                  output_names=['output'],
                  opset_version=11,
                  dynamic_axes={'input': {1: 'height', 2: 'width'}},
                  enable_onnx_checker=True)

Prediction via ONNX Runtime with dynamic size in Python:

[Screenshot: ONNX Runtime prediction output with dynamic input size]

PS:

torchvision.transforms.functional.to_tensor returns normalised values.

[Screenshot: prediction with normalised input values]

While using non-normalised values:

[Screenshot: prediction with non-normalised input values]

The same output comes out in Java 👍
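The scaling that to_tensor applies can be sketched on the Java side as well. This is a hypothetical helper, not torchvision or ONNX Runtime code: 8-bit pixel values are divided by 255 so the model sees inputs in [0, 1], and skipping this step changes the detections.

```java
// Hypothetical sketch of to_tensor-style normalisation: 8-bit pixel
// values (0..255) are scaled into the unit range (0.0..1.0) before
// being fed to the model.
public class PixelScaling {
    static float[] toUnitRange(int[] pixels) {
        float[] scaled = new float[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
            scaled[i] = pixels[i] / 255.0f; // maps 0..255 to 0.0..1.0
        }
        return scaled;
    }

    public static void main(String[] args) {
        float[] scaled = toUnitRange(new int[]{0, 128, 255});
        System.out.println(scaled[2]); // prints 1.0
    }
}
```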

Do you have any conclusions, @Craigacp @pranavsharma?

@Craigacp

Good, I'm glad you've figured it out. I'll update the error message at some point in the next couple of weeks.

@Maaitrayo

> (quotes the original report above)

making the shape [1,4] might work

@hiskuDN

hiskuDN commented Feb 28, 2023

You ended 12 hours of searching, thanks @anandvsr. Setting a dynamic input fixes the issue.

yuslepukhin pushed a commit that referenced this issue Apr 5, 2023

### Description
Allows the creation of zero length tensors via the buffer path (the
array path with zero length arrays still throws as the validation logic
to check it's not ragged would require more intrusive revision), and
allows the `tensor.getValue()` method to return a Java multidimensional
array with a zero dimension. Also added a test for the creation and
extraction behaviour.

### Motivation and Context
The Python interface can return zero length tensors (e.g. if object
detection doesn't find any objects), and before this PR in Java calling
`tensor.getValue()` throws an exception with a confusing error message.
Fixes #7270 & #15107.
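The fix works because Java can represent a zero-length multidimensional array, so a [0, 4] output has a natural carrier. A minimal illustration (plain Java, not the ONNX Runtime source):

```java
// Illustration: Java allows a zero-length multidimensional array, which is
// the natural carrier for a [0, 4] detection output with zero boxes.
public class EmptyCarrier {
    public static void main(String[] args) {
        float[][] boxes = new float[0][4]; // zero detections, four coordinates each
        System.out.println(boxes.length); // prints 0
    }
}
```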