MXNet to ONNX export tutorial #12297
Conversation
@@ -124,6 +124,9 @@ def test_nlp_cnn():
def test_onnx_super_resolution():
    assert _test_tutorial_nb('onnx/super_resolution')

def test_onnx_export_mxnet_to_onnx():
    assert _test_tutorial_nb('onnx/export_mxnet_to_onnx')
these tests are currently disabled from running on the nightly test run.
Fails CI build if new test not added in test_tutorials. If this is currently disabled, do we need it as a part of CI lint check? @marcoabreu
yes
```python
checker.check_graph(model_proto.graph)
```

This method confirms exported model protobuf is valid. Now, the model is ready to be imported in other frameworks for inference!
it verifies that the proto file is as per the ONNX specification, but do you think we should perform inference on the serialized model in any of the frameworks to ensure that the model parameters and model structure were serialized properly?
Definitely good to have! We can extend this tutorial later to show inference with other frameworks or add another tutorial to show an end-to-end use case.
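One lightweight way to do that sanity check without bringing in another framework is to round-trip the exported file through MXNet's own ONNX importer and run a dummy forward pass. A rough sketch, where the file name and the `data` input name are assumptions based on the tutorial:

```python
# Sketch only: round-trip the exported model through MXNet's ONNX importer as a sanity check.
# 'mxnet_exported_resnet18.onnx' and the input name 'data' are assumptions, not fixed by this PR.
from collections import namedtuple
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet

sym, arg_params, aux_params = onnx_mxnet.import_model('mxnet_exported_resnet18.onnx')

mod = mx.mod.Module(symbol=sym, data_names=['data'], label_names=None, context=mx.cpu())
mod.bind(for_training=False, data_shapes=[('data', (1, 3, 224, 224))])
mod.set_params(arg_params=arg_params, aux_params=aux_params, allow_missing=True)

# Feed a random image-shaped batch; if the forward pass runs and yields a (1, 1000) output,
# the structure and parameters survived the export/import round trip.
Batch = namedtuple('Batch', ['data'])
mod.forward(Batch([mx.nd.random.uniform(shape=(1, 3, 224, 224))]))
print(mod.get_outputs()[0].shape)
```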
```python
converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], np.float32, onnx_file)
```

This API returns path of the converted model which you can use further to import in other frameworks.
.. which you can later use to import the model into other frameworks.
fixed
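Side note for readers of this thread: the return value is just the path of the written `.onnx` file, so it can be handed straight to `onnx.load`. A minimal sketch, where `converted_model_path` is the variable from the export call quoted above:

```python
import onnx

# converted_model_path is the string returned by onnx_mxnet.export_model above
model_proto = onnx.load(converted_model_path)
print(model_proto.ir_version, len(model_proto.graph.node))
```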
[Open Neural Network Exchange (ONNX)](https://github.com/onnx/onnx) provides an open source format for AI models. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types.

In this tutorial, we will show how you can save MXNet models to ONNX format.
the ONNX format.
fixed
In this tutorial, we will show how you can save MXNet models to ONNX format.

Current MXNet-ONNX import and export operator support and coverage can be found [here](https://cwiki.apache.org/confluence/display/MXNET/ONNX+Operator+Coverage).
operators (plural)
Also, try linking to more context than (here). Say like:
MXNet-ONNX operators coverage and features are updated regularly. Visit the ONNX operator coverage page for the latest information.
fixed
In this tutorial we will:

- learn how to use MXNet to ONNX exporter on pretrained models
how about one sentence? Unless you're thinking of adding more bullets....
fixed
- learn how to use MXNet to ONNX exporter on pretrained models

## Pre-requisite
Prerequisites
fixed
To run the tutorial you will need to have installed the following python modules:
- [MXNet == 1.3.0](http://mxnet.incubator.apache.org/install/index.html)
- [onnx](https://github.com/onnx/onnx) v1.2.1 (follow the install guide)

MXNet ONNX importer and exporter follows version 7 of ONNX operator set.
I'm not sure what this means. I see you want ONNX v1.2.1, but there's yet another version to know about within ONNX? Would I get v1.2.1 then have to download an operator set that is version 7? Maybe this is TMI?
v1.2.1 comes with opset 7 which we currently support. Made it more clear
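For anyone unsure which ONNX release and opset they have locally, a quick check; the expected values assume the v1.2.1 / opset 7 pairing described above:

```python
import onnx
from onnx import defs

print(onnx.__version__)           # expected: 1.2.1 for this tutorial
print(defs.onnx_opset_version())  # expected: 7, the opset the MXNet exporter targets here
```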
This is useful if we are training a model. At the end of training, we just need to invoke the `export_model` function and provide sym and params objects as inputs with other attributes to save the model in ONNX format.

#### MXNet's exported json and params files:
I'd number this 2
fixed
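For the first kind of input (in-memory sym and params objects), a rough sketch of what the export could look like; the tiny network, parameter names, shapes, and output file name are made up purely for illustration:

```python
import numpy as np
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet

# A toy network standing in for the real model at the end of training
data = mx.sym.Variable('data')
fc = mx.sym.FullyConnected(data, num_hidden=10, name='fc1')
net = mx.sym.SoftmaxOutput(fc, name='softmax')

# In a real run these would be the trained parameters (e.g. from module.get_params())
params = {'fc1_weight': mx.nd.random.uniform(shape=(10, 784)),
          'fc1_bias': mx.nd.zeros(10)}

onnx_path = onnx_mxnet.export_model(net, params, [(1, 784)], np.float32, 'toy_model.onnx')
print(onnx_path)
```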
This is useful if we have pre-trained models and we want to convert them to ONNX format.

In this tutorial, we will show second usecase to convert pretrained model to ONNX format:
use case
fixed
```python
# Downloaded input symbol and params files
sym = 'resnet-18-symbol.json'
```
At this point I'm thinking I was supposed to manually download the files from the zoo. Is that right?
We downloaded the files before; we will use those now.
## Check validity of ONNX model

Now we can check validity of the converted ONNX model by using ONNX checker tool as follows:
...by using ONNX checker tool. The tool will validate the model by checking if the content contains valid protobuf:
fixed
```python
checker.check_graph(model_proto.graph)
```

This method confirms exported model protobuf is valid. Now, the model is ready to be imported in other frameworks for inference!
Any comment on what to do if the check fails or common reasons why it might fail?
Also, is there a call to action or next steps?
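On the failure question: a hedged sketch of how the notebook could surface a bad export instead of crashing. Checker failures typically mean the graph uses an operator or attribute that the targeted opset (7 here) cannot express:

```python
import onnx
from onnx import checker

# converted_model_path comes from the export_model call earlier in the tutorial
model_proto = onnx.load(converted_model_path)
try:
    checker.check_graph(model_proto.graph)
    print('Exported model is a valid ONNX graph.')
except onnx.checker.ValidationError as err:
    # Common causes: operators or attributes not covered by the opset-7 exporter
    print('Validation failed:', err)
```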
@aaronmarkham Addressed the comments.
Just a few more updates needed.
## Pre-requisite
## Prerequisite
Make plural
From the above API description, we can see that export_model API accepts 2 kinds of inputs:

#### MXNet sym, params objects:
From the above API description, we can see that export_model API accepts two kinds of inputs:
that the export_model
@@ -95,6 +95,8 @@ model_proto = onnx.load(converted_model_path)
```python
checker.check_graph(model_proto.graph)
```

If the converted protobuf format doesn't qualify to ONNX proto specifications, the checker will throw errors but in this case it successfully passes.
errors, but
@aaronmarkham Thanks for reviewing. Addressed new comments :)
Thanks for this tutorial @Roshrini !
MXNet-ONNX operators coverage and features are updated regularly. Visit the [ONNX operator coverage](https://cwiki.apache.org/confluence/display/MXNET/ONNX+Operator+Coverage) page for the latest information.

In this tutorial we will learn how to use MXNet to ONNX exporter on pre-trained models.
nit: In this tutorial, we will..
Making a hyperlink to the API spec for the mxnet-to-onnx exporter would be useful.
## Prerequisite

To run the tutorial you will need to have installed the following python modules:
- [MXNet == 1.3.0](http://mxnet.incubator.apache.org/install/index.html)
Should this be `>=` to make this tutorial relevant going forward?
To run the tutorial you will need to have installed the following python modules:
- [MXNet == 1.3.0](http://mxnet.incubator.apache.org/install/index.html)
- [onnx](https://github.com/onnx/onnx) v1.2.1 (follow the install guide)
Please link to the install guide. Maybe this? https://github.com/onnx/onnx#installation
## Downloading a model from the MXNet model zoo

We download a pre-trained model, in this case ResNet-18 model, trained on [ImageNet](http://www.image-net.org/) from the [MXNet Model Zoo](http://data.mxnet.io/models/imagenet/).
We download the pre-trained ResNet-18 ImageNet model from MXNet Model Zoo
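For context, the download step being referred to presumably looks something like this; the model zoo URLs are assumptions based on the zoo's layout:

```python
import mxnet as mx

# Download the pre-trained ResNet-18 symbol, params, and synset files into the working directory
path = 'http://data.mxnet.io/models/imagenet/'
[mx.test_utils.download(path + 'resnet/18-layers/resnet-18-0000.params'),
 mx.test_utils.download(path + 'resnet/18-layers/resnet-18-symbol.json'),
 mx.test_utils.download(path + 'synset.txt')]
```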
## MXNet to ONNX exporter API

We can check MXNet's ONNX `export_model` API usage as follows:
Let us describe MXNet's `export_model` to ONNX API.
```python
help(onnx_mxnet.export_model)
```
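For the `help` call above to resolve, the exporter module has to be imported first; presumably the tutorial does something like the following, with the printed signature noted as a rough expectation rather than a guarantee:

```python
# Assumed import so that the name onnx_mxnet resolves
from mxnet.contrib import onnx as onnx_mxnet

help(onnx_mxnet.export_model)
# In MXNet 1.3 the printed signature is roughly:
#   export_model(sym, params, input_shape, input_type=<class 'numpy.float32'>,
#                onnx_file_path='model.onnx', verbose=False)
```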
Do you want to show the output here?
Done
```python
help(onnx_mxnet.export_model)
```

From the above API description, we can see that export_model API accepts two kinds of inputs:
export_model API can accept the MXNet model in one of the following two ways.
2. MXNet's exported json and params files:
    * This is useful if we have pre-trained models and we want to convert them to ONNX format.

In this tutorial, we will show second use case to convert pre-trained model to ONNX format:
Since we have downloaded the pre-trained model files, we will use the export_model API by passing the paths for the symbol and params files.
## How to use MXNet to ONNX exporter API

We will use downloaded files and define input variables.
We will use the downloaded pre-trained model files (sym, params)...
```python
# Downloaded input symbol and params files
sym = 'resnet-18-symbol.json'
```
will be more clear that this is file path if we use - './resnet-18-symbol.json' etc. Thoughts?
I think it will be more explicit. Changed it.
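Putting the pieces of this thread together, the updated cell presumably reads roughly as follows; the params and output file names are assumptions consistent with the downloaded ResNet-18 files:

```python
import numpy as np
from mxnet.contrib import onnx as onnx_mxnet

# Downloaded input symbol and params files
sym = './resnet-18-symbol.json'
params = './resnet-18-0000.params'

# Standard ImageNet input - 3 channels, 224 x 224
input_shape = (1, 3, 224, 224)

# Path of the output file
onnx_file = './mxnet_exported_resnet18.onnx'

converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], np.float32, onnx_file)
print(converted_model_path)
```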
@sandeep-krishnamurthy Thanks for reviewing. Addressed comments.
Thanks!
LGTM
* mxnet to onnx export tutorial added
* test added
* addressing review comment
* comments addressed
* few more fixes
* addressing comments
* addressing comments
* retrigger build
Description
Example showing how to use MXNet to ONNX exporter
@aaronmarkham