Hi, I quantized a model with torch.fx and then converted the quantized model to ONNX; the error I ran into is as follows:

[19:20:42] :93: These Op Not Support: ONNX::Conv | ONNX::DequantizeLinear | ONNX::QuantizeLinear
Converted Failed!
Traceback (most recent call last):
  File "/root/miniconda3/bin/mnnconvert", line 8, in <module>
    sys.exit(main())
  File "/root/miniconda3/lib/python3.8/site-packages/MNN/tools/mnnconvert.py", line 49, in main
    dst_model_size = os.path.getsize(arg_dict["MNNModel"]) / 1024.0 / 1024.0
  File "/root/miniconda3/lib/python3.8/genericpath.py", line 50, in getsize
    return os.stat(filename).st_size
FileNotFoundError: [Errno 2] No such file or directory: 'qat_int8.mnn'

Below is a quantized MobileNetV2 model:
qat_int8.zip
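
For context, here is a minimal sketch of the pipeline described above (FX graph-mode QAT, convert, then ONNX export), assuming a recent PyTorch where the torch.ao.quantization FX APIs are available. It is an assumed reproduction, not the exact script behind qat_int8.zip: a tiny Conv+ReLU module stands in for the MobileNetV2, and the qconfig, input shape, opset, and file names are my own choices.

```python
import torch
from torch.ao.quantization import get_default_qat_qconfig_mapping
from torch.ao.quantization.quantize_fx import convert_fx, prepare_qat_fx


class TinyConvNet(torch.nn.Module):  # stand-in for the real model
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, 3, padding=1)
        self.relu = torch.nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))


model = TinyConvNet().train()
example_inputs = (torch.randn(1, 3, 224, 224),)

# Insert fake-quant observers for QAT (fbgemm backend assumed).
qconfig_mapping = get_default_qat_qconfig_mapping("fbgemm")
prepared = prepare_qat_fx(model, qconfig_mapping, example_inputs)

# The QAT fine-tuning loop would run here; one forward pass stands in for it.
prepared(*example_inputs)

# Convert to a quantized model and export. The quantize/dequantize pairs show
# up in the ONNX graph as QuantizeLinear / DequantizeLinear nodes around a
# float Conv, i.e. the ops the mnnconvert log above reports as unsupported.
quantized = convert_fx(prepared.eval())
torch.onnx.export(quantized, example_inputs, "qat_int8.onnx", opset_version=13)
```

The failing step would then be something like `mnnconvert -f ONNX --modelFile qat_int8.onnx --MNNModel qat_int8.mnn --bizCode MNN`. Judging from the traceback, conversion aborts on the unsupported ops, so qat_int8.mnn is never written, and the output-size check in mnnconvert.py then raises FileNotFoundError.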