
[Relay][Frontend][Onnx] If Operator Support #6730

Merged 6 commits into apache:main on Nov 6, 2020

Conversation

@jwfromm (Contributor) commented Oct 21, 2020

This surprisingly small PR adds support for the If operator in the ONNX frontend. It turned out to be a fairly direct conversion to Relay!
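For readers unfamiliar with the operator: ONNX `If` takes a boolean condition input plus two subgraph attributes (`then_branch` and `else_branch`), and exactly one branch executes. A minimal pure-Python sketch of these semantics (illustrative only — names are hypothetical, and this is not the actual TVM converter code):

```python
# Illustrative model of ONNX If semantics; eval_if is a hypothetical
# helper, not the actual TVM Relay frontend code.

def eval_if(cond, then_branch, else_branch):
    """Run exactly one subgraph depending on the boolean condition,
    mirroring how a conditional expression selects between branches."""
    return then_branch() if cond else else_branch()

# Subgraphs here are plain callables returning constants.
print(eval_if(True, lambda: 1, lambda: -2))   # prints 1
print(eval_if(False, lambda: 1, lambda: -2))  # prints -2
```

The conversion is "direct" in the sense that this condition/branch structure maps one-to-one onto a conditional expression in the target IR.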

@jwfromm (Contributor, Author) commented Oct 21, 2020

@mbrookhart @masahi @tmoreau89 @soiferj can you guys take a look at this PR?

@mbrookhart (Contributor) left a comment

LGTM

@tmoreau89 (Contributor) left a comment

LGTM, thanks @jwfromm !

@masahi (Member) commented Oct 23, 2020

@jwfromm, please run CI again.

@jwfromm force-pushed the onnx_if branch 2 times, most recently from f1b154e to 65454b7 on October 26, 2020 at 22:52
@jwfromm (Contributor, Author) commented Oct 26, 2020

I'm pretty confused about why this is failing CI. On the GPU test, onnxruntime produces the wrong value (-2 instead of the correct value of 1) while TVM produces the proper result; on the CPU node, the test passes. I'm not able to replicate the error on my local GPU machine. Does anyone have any ideas (@masahi) why that might happen?

@masahi (Member) commented Oct 26, 2020

Maybe the old version of onnxruntime had a bug? ORT on CI is fairly old (v1.00) while the latest one is v1.5. You probably have a newer ORT locally.

We'll update our PyTorch version soon-ish, sometime next month #6594. Maybe we can also update ORT together if desired.

@jwfromm The new ONNX version v1.8 should also be out soon onnx/onnx#3072

@jwfromm (Contributor, Author) commented Oct 26, 2020

Good point, I'll try to check out using v1.00. Thanks!

@masahi (Member) commented Oct 29, 2020

@jwfromm something is wrong with CI

@jwfromm (Contributor, Author) commented Nov 4, 2020

Sorry for the delay. It seems like the only path forward on this for now is to disable GPU tests until onnxruntime is updated to a newer version.

@jwfromm (Contributor, Author) commented Nov 5, 2020

@masahi, I ended up removing onnxruntime from the If test for now. Since the correct result in this case is very clear, there's not much need to generate it with onnxruntime. Once we update to a newer ORT version, I'll switch back.
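The testing approach described here — asserting against a hand-computed constant instead of an onnxruntime-generated reference — can be sketched as follows. `run_if_model` is a hypothetical stand-in for the real harness, which would compile and execute the ONNX graph with TVM:

```python
# Hedged sketch: assert against a known-correct constant rather than a
# reference produced by onnxruntime. `run_if_model` is a hypothetical
# stand-in for compiling and running the ONNX If graph with TVM.

def run_if_model(cond: bool) -> int:
    # Stand-in that models the graph's semantics directly: the If node
    # returns one constant from each of its two branches.
    return 1 if cond else -2

# The correct result for cond=True is unambiguous, so a hardcoded
# expected value is a sufficient reference until ORT is upgraded.
expected = 1
assert run_if_model(True) == expected
```

The trade-off is losing cross-validation against a second implementation, which is acceptable only because the expected output here is trivially known.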

@masahi masahi merged commit 9ea4bf5 into apache:main Nov 6, 2020
@masahi (Member) commented Nov 6, 2020

thanks @jwfromm @mbrookhart @tmoreau89

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Dec 2, 2020
* If operator support in ONNX.

* Small tweak.

* Added uses_gpu tag.

* Disable test on GPU until onnxruntime version is updated.

* Use parametrize_target to specify CPU only.

* Just dont use onnxruntime for now i guess.
trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Dec 4, 2020 (same commit messages as above)
trevor-m pushed a commit to neo-ai/tvm that referenced this pull request Dec 4, 2020 (same commit messages as above)
@jwfromm jwfromm deleted the onnx_if branch April 12, 2023 15:55
Labels: none yet · Projects: none yet · 4 participants