Describe the issue
I use the onnxruntime Python API to quantize a model that contains a custom operator.
The @onnx_op decorator (from onnxruntime-extensions) was used in my code to register the custom op:
```python
import numpy as np
from onnxruntime_extensions import onnx_op, PyCustomOpDef


@onnx_op(op_type='LocatePostNms',
         domain='ai.onnx.contrib',
         inputs=[PyCustomOpDef.dt_float, PyCustomOpDef.dt_int32, PyCustomOpDef.dt_float,
                 PyCustomOpDef.dt_float, PyCustomOpDef.dt_int32, PyCustomOpDef.dt_float],
         outputs=[PyCustomOpDef.dt_float, PyCustomOpDef.dt_int32],
         attrs={"classNumber": PyCustomOpDef.dt_int32,
                "iou_thresh": PyCustomOpDef.dt_float,
                "scale": PyCustomOpDef.dt_double,
                "side": PyCustomOpDef.dt_int32,
                "topk": PyCustomOpDef.dt_int32,
                "type": PyCustomOpDef.dt_int32})
def locatepostnms_compute(*inputs, **kwargs):
    # Dummy implementation: accept all six declared inputs positionally and
    # return fixed-shape outputs whose dtypes match the declared output types.
    padded_box = np.ones((1, 24000, 7), dtype=np.float32)
    result_cnt = np.ones((1,), dtype=np.int32)
    return padded_box, result_cnt
```
I also set sess_options.register_custom_ops_library(get_library_path()), but an error occurred when running the onnxruntime.quantization.quantize.quantize_static interface:
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [TypeInferenceError] Cannot infer type and shape for node name 453. No opset import for domain ai.onnx.contrib optype LocatePostNms
where 453 is the name of the custom op node. How do I infer type and shape for this kind of ONNX model so that the quantization can finish?
Sorry, I can't provide the model.
Urgency
No response
Target platform
Windows 11 Pro / Ubuntu 22.04
Build script
Python
Error / output
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [TypeInferenceError] Cannot infer type and shape for node name 453. No opset import for domain ai.onnx.contrib optype LocatePostNms
Visual Studio Version
No response
GCC / Compiler Version
No response