
Add onnx export #2596

Merged
merged 25 commits into Lightning-AI:master from feature/2271_onnx_export on Jul 31, 2020

Conversation

lezwon
Contributor

@lezwon lezwon commented Jul 13, 2020

What does this PR do?

Adds functionality to quickly export your model to ONNX format.

Fixes #2271

Before submitting

  • Was this discussed/approved via a GitHub issue? (not required for typo and docs fixes)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing instead of bundling several changes together? If not, please open a separate PR for each change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If your PR wasn't first discussed in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from a team July 13, 2020 10:47
@lezwon lezwon force-pushed the feature/2271_onnx_export branch from 6c6e233 to 19f9a44 Compare July 13, 2020 10:52
@Borda Borda added the feature (Is an improvement or enhancement) and Important labels Jul 13, 2020
@Borda Borda added this to the 0.9.0 milestone Jul 13, 2020
tests/base/model_template.py (outdated; resolved)
@mergify mergify bot requested a review from a team July 13, 2020 11:16
Member

@Borda Borda left a comment

can we also add a test on inference?

@mergify mergify bot requested a review from a team July 13, 2020 11:18
@lezwon
Contributor Author

lezwon commented Jul 13, 2020

> can we also add a test on inference?

Do you mean using the ONNX Runtime?

@Borda
Member

Borda commented Jul 13, 2020

> can we also add a test on inference?
>
> Do you mean using the ONNX Runtime?

yes, or another way to test that it really does something: accept a given array as input and return an output in the expected range

@lezwon
Contributor Author

lezwon commented Jul 13, 2020

> can we also add a test on inference?
>
> Do you mean using the ONNX Runtime?
>
> yes, or another way to test that it really does something: accept a given array as input and return an output in the expected range

Cool. I'll add tests based on the examples mentioned here: https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html

@justusschock
Member

we should add a test that the outputs of the exported model match the outputs of the original one :)

@pep8speaks

pep8speaks commented Jul 14, 2020

Hello @lezwon! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-07-31 08:02:08 UTC

@lezwon
Contributor Author

lezwon commented Jul 14, 2020

> we should add a test that the outputs of the exported model match the outputs of the original one :)

I've added that one in test_if_inference_output_is_valid :]

@lezwon
Contributor Author

lezwon commented Jul 14, 2020

@Borda I need some help with the tests. It's breaking on ubuntu-20.04 because onnx is not installed on it. Seems to be working on others.

@Borda
Member

Borda commented Jul 14, 2020

> @Borda I need some help with the tests. It's breaking on ubuntu-20.04 because onnx is not installed on it. Seems to be working on others.

The problem is not Ubuntu but Conda: you added the dependency to the pip requirements, so you also need to add it here:
https://github.com/PyTorchLightning/pytorch-lightning/blob/master/environment.yml
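Concretely, that would mean adding the package under the `pip` section of `environment.yml`, something like the excerpt below (the version bound is illustrative, not taken from the PR):

```yaml
# environment.yml (excerpt)
dependencies:
  - pip:
      - onnx>=1.7.0
```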

Contributor

@jeremyjordan jeremyjordan left a comment

This looks great! Easy export to ONNX is a nice feature and I really appreciate all the tests you've written to ensure the feature remains stable :)

@mergify mergify bot requested a review from a team July 15, 2020 02:27
@mergify mergify bot requested a review from a team July 15, 2020 03:25
@Borda Borda requested a review from awaelchli July 15, 2020 06:12
Member

@justusschock justusschock left a comment

LGTM, just one minor note :)

pytorch_lightning/core/lightning.py (resolved)
@lezwon lezwon force-pushed the feature/2271_onnx_export branch from 6249630 to 747f079 Compare July 16, 2020 04:52
@lezwon lezwon marked this pull request as ready for review July 19, 2020 18:41
CHANGELOG.md (outdated; resolved)
# if you specify an example input, the summary will show input/output for each layer
# TODO: to be fixed in #1773
# self.example_input_array = torch.rand(5, 28 * 28)
self.example_input_array = torch.rand(5, 28 * 28)
Member

I thought we were talking about having it as a property, right? @awaelchli

Contributor

you mean the user overrides the property function?

Member

yes, at least we talked about it, but maybe there was some pickling issue?
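The idea under discussion can be sketched as follows (the class and shapes here are illustrative stand-ins, not the PR's code):

```python
import torch

class LitModel(torch.nn.Module):  # stand-in for a LightningModule
    @property
    def example_input_array(self):
        # Computed on access instead of stored as instance state, so it
        # never has to be pickled along with the model.
        return torch.rand(5, 28 * 28)

model = LitModel()
```

Because a property is defined on the class rather than stored on the instance, it stays out of the pickled state, which may be what sidesteps the issue mentioned above.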

@lezwon
Contributor Author

lezwon commented Jul 29, 2020

> @lezwon are there docs?
> can you add a new section called production or something like that in common use cases.

Sure. Will do that :]

@lezwon lezwon force-pushed the feature/2271_onnx_export branch from eb9b6e4 to 9765c31 Compare July 29, 2020 19:01
@lezwon lezwon force-pushed the feature/2271_onnx_export branch from 9765c31 to add43b7 Compare July 29, 2020 19:03
CHANGELOG.md (outdated; resolved)
@Borda Borda merged commit b7afac3 into Lightning-AI:master Jul 31, 2020
Labels
feature (Is an improvement or enhancement), ready (PRs ready to be merged)
Development

Successfully merging this pull request may close these issues.

Could I convert lightning module to onnx? Thanks!
8 participants