This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
gluon.SymbolBlock cannot imports resnet trained with dtype="float16" #11849
Comments
@sandeep-krishnamurthy Please help to label this issue: Gluon
Got the same problem here. Any updates?
@rahul003 Same problem here; can't load back saved float16 models.
Using the snippet below:
Below are my findings:
Below is the issue:
I am working on the fix. @apeforest @ThomasDelteil - FYI
Resolving as the changes are merged.
Description
Cannot load a resnet101 (incubator-mxnet/example/image-classification/symbols/resnet.py) fine-tuned with dtype="float16" via the gluon.SymbolBlock.imports method.
Error Message:
AssertionError: Failed loading Parameter 'stage3_unit2_conv2_weight' from saved params: dtype incompatible expected <type 'numpy.float32'> vs saved <type 'numpy.float16'>
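The check that fails can be sketched as follows (a hypothetical numpy simplification, not Gluon's actual loading code): the Parameter object was created with the default float32 dtype, while the saved array is float16, so the dtype comparison raises.

```python
import numpy as np

# Hypothetical simplification of the failing check: the Parameter was
# created as float32, but the saved array is float16, so the dtype
# comparison raises. The parameter name is taken from the error above.
expected_dtype = np.float32
saved = np.zeros((3, 3), dtype=np.float16)  # stands in for a saved weight
try:
    assert saved.dtype == expected_dtype, (
        "Failed loading Parameter 'stage3_unit2_conv2_weight' from saved "
        "params: dtype incompatible expected %s vs saved %s"
        % (expected_dtype, saved.dtype))
except AssertionError as err:
    message = str(err)  # mirrors the error message reported above
```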
Minimum reproducible example
My Questions
In incubator-mxnet/example/image-classification/symbols/resnet.py, mx.sym.Cast is used for type conversion.
I fine-tuned resnet101 with dtype="float16", and I need to load this model as a HybridBlock. However, gluon.SymbolBlock.imports makes every parameter's type in the network float32, so the trained model cannot be loaded back.
Here, resnet-101-0007.params was trained with the argument dtype='float16'.
In the resnet-101-symbol.json file, there is a Cast op:
```json
{
  "op": "Cast",
  "name": "cast0",
  "attrs": {"dtype": "float16"},
  "inputs": [[7, 0, 0]]
},
```
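At runtime this op behaves like a dtype conversion on the activations: everything flowing through the network after cast0 is float16, which is why the weights downstream were saved in float16. A numpy stand-in for mx.sym.Cast (illustrative only; shapes are assumptions):

```python
import numpy as np

# numpy stand-in for mx.sym.Cast(dtype='float16'): the float32 input is
# converted to float16 before the residual stages, so the weights
# consumed after this point end up stored as float16 too.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # float32 input
x16 = x.astype(np.float16)  # equivalent of the cast0 op in the JSON
```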
It seems that gluon.SymbolBlock.imports does not take the type-conversion operator into account.
For now, I think I need to load all parameters manually, change their types, and then save them again.
Is there any other way to solve this problem?
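The manual workaround mentioned above can be sketched like this (a numpy sketch under assumptions: with MXNet the analogous calls would be mx.nd.load, NDArray.astype, and mx.nd.save; the parameter name and shape here are illustrative only):

```python
import numpy as np

# Sketch of the manual workaround: load the saved float16 parameters,
# cast each array to the dtype SymbolBlock expects (float32), then save
# the result. numpy stands in for MXNet NDArrays; the name and shape
# below are made up for illustration.
params = {
    "stage3_unit2_conv2_weight": np.zeros((256, 256, 3, 3),
                                          dtype=np.float16),
}
casted = {name: arr.astype(np.float32) for name, arr in params.items()}
```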