
Conversation

@KJlaccHoeUM9l
Contributor

Earlier I added a converter for QAttention (see PR #13654).
In this PR, I've added support for the `past` input and the `unidirectional` attribute in Attention. I've also moved the code shared by QAttention and Attention into a common base class.
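For readers unfamiliar with the attribute: setting `unidirectional=1` makes the Attention op causal, so position `i` may only attend to positions `j <= i`. The sketch below is a minimal single-head NumPy illustration of that masking, not the converter's actual Relay code; the function name and shapes are hypothetical.

```python
import numpy as np

def attention(q, k, v, unidirectional=False):
    """Single-head scaled dot-product attention over (seq_len, head_dim) arrays."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if unidirectional:
        # Causal mask: zero out (via a large negative score) every
        # position j > i so token i cannot see future tokens.
        seq_len = scores.shape[0]
        future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores = np.where(future, -1e9, scores)
    # Numerically stable softmax over the last axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

With the causal mask, the first output row depends only on the first value row, which is a convenient sanity check when testing a converter.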

@tvm-bot
Collaborator

tvm-bot commented Jan 17, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

Contributor

@echuraev echuraev left a comment


LGTM

@KJlaccHoeUM9l
Contributor Author

@tvm-bot rerun

@KJlaccHoeUM9l KJlaccHoeUM9l force-pushed the agladyshev/dev/attention branch from 4a8b581 to 8e7462f Compare January 19, 2023 08:42
Contributor

@vvchernov vvchernov left a comment


LGTM. I believe the problems with the magic number -10000 from ONNX Runtime will be resolved somewhere upstream.
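For context on the magic number: ONNX Runtime's Attention implementation adds a large negative constant (-10000) to the scores of masked positions instead of negative infinity. For typical score magnitudes in float64 this behaves the same as -inf after softmax; the exact constant only matters at the numerical margins (e.g. low-precision types). A rough illustration with made-up scores:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - x.max())
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.5])

# Mask out the last position two ways: true -inf vs. ONNX Runtime's -10000.
p_inf = softmax(scores + np.array([0.0, 0.0, -np.inf]))
p_10k = softmax(scores + np.array([0.0, 0.0, -10000.0]))
```

Both distributions agree here, which is why the constant usually goes unnoticed until precision or score scale changes.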

@KJlaccHoeUM9l
Contributor Author

Hello @AndrewZhaoLuo!
It looks like we can merge this PR now?

@AndrewZhaoLuo AndrewZhaoLuo merged commit cc352a4 into apache:main Jan 19, 2023
fzi-peccia pushed a commit to fzi-peccia/tvm that referenced this pull request Mar 27, 2023
…rib opset (apache#13797)

* add type & shape checking

* add base class for Attention converter

* add support for 'past' input

* add support for 'unidirectional' attribute

* fix for 'huggingface implementation'

* add common method for calculating Attention

* expand test coverage for Attention
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment


5 participants