
Fix a Flatten bug in the ONNX to Relay converter #3180

Merged — 4 commits merged into apache:master on May 13, 2019

Conversation

@Oldpan (Contributor) commented on May 11, 2019

Batch_flatten, as its name suggests, should flatten each data sample of a batch.
So we don't need to extend batch_flatten to support axes other than the batch axis.
Instead, we can do a conditional conversion: convert to batch_flatten when the Flatten op uses the default axis of 1, and use reshape otherwise.
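To illustrate the conditional conversion, here is a minimal sketch (not the merged diff): the helper name `convert_flatten` and the explicit `input_shape` argument are assumptions for the example, while `relay.nn.batch_flatten` and `relay.reshape` are the actual Relay ops involved.

```python
import numpy as np
from tvm import relay

def convert_flatten(data, axis, input_shape):
    """Sketch of lowering ONNX Flatten to Relay.

    ONNX Flatten produces a 2-D tensor whose first dimension is the
    product of dims [0, axis) and whose second dimension is the product
    of dims [axis, rank). nn.batch_flatten only covers the common
    axis == 1 case, so fall back to a plain reshape otherwise.
    """
    if axis == 1:
        return relay.nn.batch_flatten(data)
    # `input_shape` is assumed to be the statically known input shape
    # (the real frontend infers it from the graph); np.prod of an empty
    # slice is 1, which also covers axis == 0.
    first_dim = int(np.prod(input_shape[:axis]))
    return relay.reshape(data, (first_dim, -1))
```

With the default ONNX axis of 1 this maps directly onto batch_flatten, which existing Relay passes already handle well; the reshape branch only exists to preserve Flatten's semantics for the less common axis values.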

@Oldpan (Contributor, Author) commented on May 11, 2019

Related pull request #2843

@jroesch (Member) commented on May 11, 2019

@zhreshold LGTM, thoughts?

@jroesch requested a review from @zhreshold on May 11, 2019 at 22:20
@tqchen (Member) left a comment


ok modulo nit

Review comment on python/tvm/relay/frontend/onnx.py (outdated, resolved)
@jroesch merged commit 25c91d3 into apache:master on May 13, 2019
wweic pushed a commit to wweic/tvm that referenced this pull request May 13, 2019
* fix onnx frontend flatten bug

* Update onnx.py

* Update onnx.py

* Update onnx.py
wweic pushed a commit to neo-ai/tvm that referenced this pull request May 13, 2019
* fix onnx frontend flatten bug

* Update onnx.py

* Update onnx.py

* Update onnx.py