About Multi-GPUs for training? #26
Hi! The code is expected to support multi-GPU training now with DataParallel.
I trained the model with your code using your official command, i.e., python core/train.py --config configs/celeba-hq.yaml --gpus 0,1.
A link for this issue is here. Could you look into this problem in your spare time? @imlixinyang
Try the command "python core/train.py --config configs/celeba-hq.yaml --gpus 0 1".
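The fix above suggests the script declares `--gpus` as a space-separated list. A minimal sketch of that pattern, assuming an `argparse` flag with `nargs='+'` and `type=int` (the repo's actual CLI definition may differ):

```python
import argparse

# Hypothetical parser mirroring a "--gpus 0 1" style flag: nargs='+' collects
# one or more whitespace-separated tokens, and type=int converts each one.
parser = argparse.ArgumentParser()
parser.add_argument('--gpus', type=int, nargs='+', default=[0],
                    help='GPU ids, space-separated, e.g. --gpus 0 1')

# "--gpus 0 1" arrives as two tokens, so argparse yields a list of ints.
args = parser.parse_args(['--gpus', '0', '1'])
print(args.gpus)  # [0, 1]

# "--gpus 0,1" would instead pass the single token "0,1" to int(), which
# fails, so comma-separated ids never reach the script as a list. Once
# parsed, the ids can be handed to DataParallel, e.g.:
#   model = torch.nn.DataParallel(model, device_ids=args.gpus)
```

Under this assumed declaration, `--gpus 0,1` is rejected at parse time; how the actual script degrades to a single GPU in that case is not shown in the thread.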
I will try it now. Thanks for your reply. I am working on a new project based on your novel work, and I will cite it.
It works now. Thanks again. @imlixinyang
Glad to hear that! Good luck with your research.
Hi, authors:
I found an issue when training the code with multiple GPUs: only a single GPU is used even when multiple GPUs are specified. Could you look into this problem in your spare time?
Thanks!