
Multi-precision support for Adam optimizer #13753

Closed
eric-haibin-lin opened this issue Jan 1, 2019 · 3 comments

Comments

@eric-haibin-lin
Member

As the title says: the Adam optimizer currently doesn't support mixed-precision training, i.e. updating FP16 weights while keeping the optimizer math and a master copy of the weights in FP32.
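For context, here is a minimal NumPy sketch of what a multi-precision Adam step typically does. This illustrates the general technique, not MXNet's actual kernel, and the function name `mp_adam_step` is hypothetical: moments and a master weight copy live in FP32, the FP16 gradient is upcast before the update, and the FP16 weight is re-cast from the master copy afterward.

```python
import numpy as np

def mp_adam_step(weight16, master32, m, v, grad16, t,
                 lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, wd=0.0):
    """One multi-precision Adam step: math in FP32, storage in FP16."""
    grad = grad16.astype(np.float32) + wd * master32   # upcast FP16 gradient
    m[:] = beta1 * m + (1 - beta1) * grad              # FP32 first moment
    v[:] = beta2 * v + (1 - beta2) * grad * grad       # FP32 second moment
    m_hat = m / (1 - beta1 ** t)                       # bias correction
    v_hat = v / (1 - beta2 ** t)
    master32 -= lr * m_hat / (np.sqrt(v_hat) + eps)    # update FP32 master copy
    weight16[:] = master32.astype(np.float16)          # write back FP16 weights
```

Without the FP32 master copy, small Adam updates can underflow to zero when applied directly to FP16 weights, which is the failure mode multi-precision support avoids.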


@anirudhacharya
Member

anirudhacharya commented Mar 19, 2019

@eric-haibin-lin will the multi-precision update function need to support the row_sparse storage type and lazy_update of weight arrays?

@eric-haibin-lin
Member Author

No. This does not seem to be a common use case for the adamw optimizer.

Closed with #14171.
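For reference, enabling mixed precision with Gluon might then look roughly like the sketch below. It assumes the `multi_precision` flag is accepted by the Adam optimizer via MXNet's `Optimizer` base class; treat the exact parameters here as assumptions rather than the authoritative API of the fix in #14171.

```python
import mxnet as mx
from mxnet import gluon

net = gluon.nn.Dense(10)
net.initialize(ctx=mx.gpu())
net.cast('float16')  # FP16 weights and activations

# multi_precision=True asks the optimizer to keep an FP32 master
# copy of each FP16 weight (assumed supported by 'adam' after #14171)
trainer = gluon.Trainer(net.collect_params(), 'adam',
                        {'learning_rate': 1e-3, 'multi_precision': True})
```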
