Add onnx export #2596
Conversation
force-pushed from 6c6e233 to 19f9a44
can we also add a test on inference?
Do you mean using the ONNX Runtime?
Yes, or some other way to test that it really does something: accept a given array as input and return an output in the expected range.
Cool. I'll add tests based on the examples mentioned here: https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html
We should add a test that the outputs of the exported model match the outputs of the original one :)
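For reference, a rough sketch of such a parity test, loosely following the linked tutorial (the model class, file name, and tolerances here are illustrative, not the PR's actual test code):

import numpy as np
import onnxruntime
import torch

def test_onnx_outputs_match_original(tmpdir):
    # hypothetical LightningModule taking a (5, 28 * 28) input; stands in for the repo's test model
    model = SimpleMNISTModel()
    model.eval()

    dummy_input = torch.rand(5, 28 * 28)
    file_path = str(tmpdir / "model.onnx")
    torch.onnx.export(model, dummy_input, file_path)

    with torch.no_grad():
        torch_out = model(dummy_input)

    # run the exported graph with ONNX Runtime on the same input
    session = onnxruntime.InferenceSession(file_path)
    input_name = session.get_inputs()[0].name
    onnx_out = session.run(None, {input_name: dummy_input.numpy()})[0]

    # the exported model's outputs should match the original within tolerance
    np.testing.assert_allclose(torch_out.numpy(), onnx_out, rtol=1e-03, atol=1e-05)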
Hello @lezwon! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-07-31 08:02:08 UTC
I've added that one in.
@Borda I need some help with the tests. It's breaking on ubuntu-20.04 because onnx is not installed on it. Seems to be working on others.
The problem is not Ubuntu but Conda: you added the dependency to the pip environment, so you also need to add it here:
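Presumably this refers to the repository's Conda environment file (e.g. environment.yml); as a rough sketch, the pip requirement would need to be mirrored there along these lines (the version pin is illustrative):

# environment.yml (sketch): mirror the new pip dependency here as well
dependencies:
  - pip:
      - onnx>=1.7.0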
This looks great! Easy export to ONNX is a nice feature and I really appreciate all the tests you've written to ensure the feature remains stable :)
LGTM, just one minor note :)
force-pushed from 6249630 to 747f079
# if you specify an example input, the summary will show input/output for each layer
# TODO: to be fixed in #1773
# self.example_input_array = torch.rand(5, 28 * 28)
self.example_input_array = torch.rand(5, 28 * 28)
I thought we were talking about having it as a property, right? @awaelchli
you mean the user overrides the property function?
Yes, at least we talked about it, but maybe there was some pickling issue?
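A minimal sketch of the idea being discussed, i.e. the user overriding example_input_array as a property on their LightningModule (illustrative only; the setter simply tolerates any default assignment the base class might make, and the pickling concern mentioned above is exactly the open question):

import torch
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(28 * 28, 10)

    @property
    def example_input_array(self):
        # computed on access instead of being assigned as a plain attribute
        return torch.rand(5, 28 * 28)

    @example_input_array.setter
    def example_input_array(self, value):
        # ignore assignments (e.g. a default set in the base class); the getter is the source of truth
        pass

    def forward(self, x):
        return self.layer(x)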
Sure. Will do that :]
force-pushed from eb9b6e4 to 9765c31
add to changelog
Co-authored-by: Jirka Borovec <[email protected]>
force-pushed from 9765c31 to add43b7
What does this PR do?
Adds functionality to quickly export your model to ONNX format.
Fixes #2271
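As a quick sketch of the intended usage (assuming the hook is exposed on LightningModule as to_onnx with an optional input_sample argument that falls back to example_input_array when one is set; the model class is the illustrative one from the property sketch above):

import torch

model = LitClassifier()  # any LightningModule; illustrative class name

# export with an explicit input sample ...
model.to_onnx("model.onnx", input_sample=torch.rand(5, 28 * 28))

# ... or rely on self.example_input_array if the model defines one
model.to_onnx("model.onnx")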
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃