[Unity][Frontend] FX exp and strided_slice fix #14338
Conversation
Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.
Generated by tvm-bot
graph_model = fx.symbolic_trace(torch_model)
mod = from_fx(graph_model, input_info)
print(mod.script())
Left-over debug statement?
Yes, sorry for that. Just removed it.
LGTM! Just one left-over print statement.
* Add support for `exp` in the FX translator.
* Fix how the FX translator handles `None` in a torch tensor slice (e.g., `x[:, None, None]`). `None` means dimension expansion, but the previous implementation mistakenly advanced the dimension counter when it encountered `None`, which eventually caused an out-of-range dimension index.
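To illustrate the slicing semantics the fix relies on, here is a small sketch (not TVM code; NumPy's `None`/`np.newaxis` indexing matches PyTorch's in this respect): `None` inside a slice inserts a new size-1 axis rather than consuming an input axis, so a translator must not advance its input-dimension counter when it sees `None`.

```python
import numpy as np

x = np.zeros((4, 5))

# `None` expands dims: the result gains two new size-1 axes after dim 0,
# while only the leading `:` actually consumes an input dimension.
y = x[:, None, None]
print(y.shape)  # (4, 1, 1, 5)

# Equivalent explicit form via expand_dims.
z = np.expand_dims(np.expand_dims(x, 1), 2)
assert y.shape == z.shape
```

Counting each `None` as a consumed input dimension would exhaust the two real dimensions of `x` before the slice is fully processed, which is exactly the out-of-range failure described above.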
Force-pushed from fae4a65 to 05ea131