

[QNN] Refactor fixed point multiplication in requantize #4073

Merged
merged 1 commit into apache:master on Oct 8, 2019

Conversation

vinx13 (Member) commented on Oct 7, 2019

This PR extracts the fixed-point multiplication in qnn.requantize into util.cc so that it can be reused in other places, such as relay::quantize::QuantizeRealize.

Please review @anijain2305 @zhiics
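For reference, this kind of requantization expresses a floating-point scale as an integer multiplier plus a shift and then applies a rounding fixed-point multiply. The snippet below is a minimal standalone C++ sketch of that idea; the function names, the Q31 representation, and the simple round-half-up behavior are illustrative assumptions and do not reproduce the exact signature or rounding modes of the helper extracted in this PR.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Hypothetical helper (not the actual TVM API): split a positive float scale
// into a Q31 fixed-point multiplier and a shift so that
// scale ~= multiplier * 2^(shift - 31).
void QuantizeMultiplier(double scale, int32_t* multiplier, int* shift) {
  assert(scale > 0.0);
  double q = std::frexp(scale, shift);       // scale = q * 2^shift, q in [0.5, 1)
  int64_t m = static_cast<int64_t>(std::round(q * (1LL << 31)));
  if (m == (1LL << 31)) {                    // q rounded up to exactly 1.0
    m /= 2;
    ++(*shift);
  }
  *multiplier = static_cast<int32_t>(m);
}

// Rounding fixed-point multiply: approximates x * scale as
// (x * multiplier) >> (31 - shift), with simple round-half-up.
int32_t FixedPointMultiply(int32_t x, int32_t multiplier, int shift) {
  int total_shift = 31 - shift;
  assert(total_shift > 0 && total_shift < 63);
  int64_t prod = static_cast<int64_t>(x) * multiplier;
  int64_t round = 1LL << (total_shift - 1);
  return static_cast<int32_t>((prod + round) >> total_shift);
}
```

For example, QuantizeMultiplier(0.0078125, &m, &s) followed by FixedPointMultiply(x, m, s) approximates x * 0.0078125 using only integer arithmetic, which is the pattern requantize relies on.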

src/relay/qnn/util.cc (review comments resolved)
src/relay/qnn/util.h (review comments resolved)
anijain2305 (Contributor) commented

Another comment: currently, QNN includes headers (code) from Relay. I tried to keep that dependency as uni-directional as possible, but this change will create bi-directional code inclusion (which might be okay). It is therefore worth considering whether we should put FixedPointMultiply in some header in Relay that QNN can then include.

zhiics (Member) commented on Oct 7, 2019

LGTM. Or we can just add a util.h header to hold the declarations.

vinx13 (Member, Author) commented on Oct 8, 2019

@zhiics Can you approve? Or shall we move FixedPointMul to relay/quantize/util.cc?

zhiics (Member) left a comment


I think it is okay. Let's merge it.

zhiics merged commit 425430d into apache:master on Oct 8, 2019
anijain2305 pushed a commit to anijain2305/tvm that referenced this pull request Oct 17, 2019
wweic pushed a commit to neo-ai/tvm that referenced this pull request Oct 18, 2019
petrex added a commit to petrex/tvm that referenced this pull request Oct 29, 2019
* master: (21 commits)
  [Fix][VM] Fix VM invoke with set_params (apache#4079)
  [QNN] Refactor fixed point multiplication in requantize (apache#4073)
  Fix match case in Python-side expr functor (apache#4037)
  Hide symbols from dependent libraries if HIDE_PRIVATE_SYMBOLS is ON. (apache#4041)
  Add gradient for log-softmax (apache#4069)
  [DOC] Fix typos in tutorials (apache#4066)
  dicrease the complexity of CalcDep from exponential to linear (apache#4053)
  [Relay][AlterOp] Minor refactor. (apache#4064)
  [Relay][AlterOp] Improving support for broadcast layout alteration. (apache#4040)
  Add parses support for zeros_like tflite operator (apache#4042)
  [Bugfix][TF] reset graph after getting tag of savedmodel (apache#4055)
  [Relay][VM] Add more passes to VMCompiler (apache#4058)
  [Relay][VM] Add autotvm context when compile (apache#4062)
  [Bugfix] Fix target host for vm compiler (apache#4057)
  [Relay][Training] Add gradient for Crossentropy (apache#3925)
  [llvm] switch to use Align for llvm trunk (apache#4051)
  [Relay][TopHub] Add switch to disable TopHub download (apache#4015)
  [Relay][Op] Add instance norm op (apache#4004)
  [QNN][Relay] Calling Dialect passes from inside Relay Build API. (apache#3971)
  [RELAY/PASS] Fix the extent for the post_stmt in the loop partition (apache#3734)
  ...