
FP16 support for topK #14125

Closed
eric-haibin-lin opened this issue Feb 12, 2019 · 3 comments · Fixed by #15560

Comments

@eric-haibin-lin
Member

The topk operator doesn't support fp16 (float16) inputs.
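
A minimal reproduction sketch, assuming the MXNet 1.x NDArray API; the float32 call works, while the float16 one fails because the operator is not registered for that dtype:

```python
import mxnet as mx

x32 = mx.nd.array([[0.3, 0.9, 0.1], [0.5, 0.2, 0.7]], dtype='float32')
x16 = x32.astype('float16')

# float32 input: works as expected
mx.nd.topk(x32, k=2, ret_typ='value', axis=-1)

# float16 input: raises an error, since topk is not implemented for fp16
mx.nd.topk(x16, k=2, ret_typ='value', axis=-1)
```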

@mxnet-label-bot
Contributor

Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so that the appropriate MXNet community members can help resolve it.
Here are my recommended labels: Feature

@stephenrawls
Contributor

+1

This is annoying for us because we need to use topk in our CRF Viterbi decoding during inference (for k-best Viterbi).

Because topk doesn't support float16, we have to convert from float16 to float32 before doing k-best Viterbi decoding. Since this is the most time-consuming part of inference, we would prefer to benefit from float16 operations here.
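
For reference, a sketch of the cast workaround described above, assuming the MXNet NDArray API (tensor shapes and k are illustrative):

```python
import mxnet as mx

# fp16 scores coming out of the rest of the (mixed-precision) pipeline
scores_fp16 = mx.nd.random.uniform(shape=(8, 1000), dtype='float16')

# Cast up to float32 just for the topk call (extra copy + conversion)
scores_fp32 = scores_fp16.astype('float32')
vals, idx = mx.nd.topk(scores_fp32, k=5, ret_typ='both', axis=-1)

# Cast the values back down if downstream code expects fp16
vals = vals.astype('float16')
```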

@eric-haibin-lin
Member Author

Related issues: #12705, #11156, #11061
