multi-gpu inference during training #1025
Replies: 4 comments
- I think you can get the wrapped dataparallel model with
- +1 here; in testing it only works with one GPU. I tried wrapping with DataParallel and LightningDataParallel, but I get things like this:
  Any ideas @williamFalcon?
- .test() uses whatever you set up (dp, ddp, etc.). No need to do anything yourself. Hard to tell what you need without code.
- Something is going wrong in my case; I've opened a new issue to track what's going on.
❓ Questions and Help
What is your question?
During training, I need to run all the data through my model from time to time. Currently, I do this in the `on_batch_end` hook. But if I just call the model's `forward` function, it only uses one GPU. I tried to wrap the model in an `nn.DataParallel` before inferencing, but that doesn't seem to work. Any idea what I can do?
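For context, the pattern the question describes can be sketched in plain PyTorch: wrap the underlying module in `nn.DataParallel` just for the inference pass, so each batch is split across the available GPUs. This is a minimal illustrative sketch, not the thread's confirmed solution; the function name, toy model, and list-of-batches "loader" are assumptions for the example.

```python
import torch
import torch.nn as nn

def run_inference(model: nn.Module, loader) -> list:
    """Run a full pass over `loader` with the model wrapped in nn.DataParallel.

    nn.DataParallel splits each input batch across all visible GPUs; when no
    CUDA device is available it simply calls the underlying module, so this
    sketch also runs on CPU.
    """
    if torch.cuda.is_available():
        # DataParallel expects the module's parameters on the first device.
        model = model.cuda()
    wrapped = nn.DataParallel(model)
    wrapped.eval()
    outputs = []
    with torch.no_grad():  # no gradients needed for inference
        for batch in loader:
            outputs.append(wrapped(batch))
    return outputs

# Toy usage: a linear layer and three synthetic batches standing in for a DataLoader.
model = nn.Linear(4, 2)
loader = [torch.randn(8, 4) for _ in range(3)]
preds = run_inference(model, loader)
```

Note that inside a Lightning hook the model is often already wrapped by the trainer's chosen backend, which is likely why double-wrapping it manually misbehaves, as the replies above suggest.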