How to train GAN with DDP without setting "find_unused_parameters=True" #18081
              
Unanswered

function2-llx asked this question in DDP / multi-GPU / multi-node
Hello everyone,

I am developing a GAN using Lightning and Distributed Data Parallel (DDP) for training. Because GANs alternate between training the generator and the discriminator, it is hard to satisfy DDP's requirement that every model parameter contribute to the loss in each forward pass.

Setting `find_unused_parameters=True` in the DDP configuration makes training run, but it hurts performance. So I would like to know whether it is possible to train a GAN with DDP without setting `find_unused_parameters=True`. Any insights or recent developments related to this issue would be much appreciated. Thank you in advance!
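For context, here is a minimal sketch of the pattern the question describes; the tiny MLP networks, batch format, and hyperparameters are placeholders, not the asker's code. With both networks inside one `LightningModule`, `strategy="ddp"` puts the whole module in a single DDP wrapper, and the discriminator step's backward never reaches the generator's parameters, which is what trips DDP's unused-parameter check:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import lightning as L


class GAN(L.LightningModule):
    def __init__(self, latent_dim: int = 64):
        super().__init__()
        self.automatic_optimization = False  # alternate G/D updates by hand
        self.latent_dim = latent_dim
        # Placeholder networks -- substitute your own architectures.
        self.generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, 784))
        self.discriminator = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 1))

    def training_step(self, batch, batch_idx):
        real, _ = batch  # assumes (image, label) batches
        real = real.view(real.size(0), -1)
        opt_g, opt_d = self.optimizers()
        z = torch.randn(real.size(0), self.latent_dim, device=self.device)
        ones = torch.ones(real.size(0), 1, device=self.device)
        zeros = torch.zeros(real.size(0), 1, device=self.device)

        # Generator step: this backward flows through the discriminator into
        # the generator, so every parameter receives a gradient here.
        fake = self.generator(z)
        g_loss = F.binary_cross_entropy_with_logits(self.discriminator(fake), ones)
        opt_g.zero_grad()
        self.manual_backward(g_loss)
        opt_g.step()

        # Discriminator step: fake.detach() cuts the graph to the generator,
        # so the generator's parameters receive no gradient in this backward.
        # With the whole module in one DDP wrapper, this is the pass that
        # fails unless find_unused_parameters=True is set.
        logits = self.discriminator(torch.cat([real, fake.detach()]))
        d_loss = F.binary_cross_entropy_with_logits(logits, torch.cat([ones, zeros]))
        opt_d.zero_grad()
        self.manual_backward(d_loss)
        opt_d.step()

    def configure_optimizers(self):
        return (
            torch.optim.Adam(self.generator.parameters(), lr=2e-4),
            torch.optim.Adam(self.discriminator.parameters(), lr=2e-4),
        )
```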
  
Replies: 1 comment · 3 replies

Well, I find that using Fabric might be a good idea, since you can wrap the generator and the discriminator in DDP separately.
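A minimal sketch of that suggestion, assuming Lightning 2.x Fabric (the networks, fake batch data, device count, and hyperparameters below are placeholders): each `fabric.setup()` call wraps one model in its own DDP module, so every backward pass covers all parameters of the modules it actually ran through, and neither wrapper reports unused parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from lightning.fabric import Fabric

latent_dim, batch_size = 64, 32

fabric = Fabric(accelerator="auto", devices=2, strategy="ddp")
fabric.launch()

# Placeholder networks -- substitute your own generator/discriminator.
generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, 784))
discriminator = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 1))

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

# Each setup() call wraps one model in its own DDP module, so the two
# networks no longer share a single DDP bucket.
generator, opt_g = fabric.setup(generator, opt_g)
discriminator, opt_d = fabric.setup(discriminator, opt_d)

for step in range(1000):
    real = torch.randn(batch_size, 784, device=fabric.device)  # stand-in for a real batch
    z = torch.randn(batch_size, latent_dim, device=fabric.device)
    ones = torch.ones(batch_size, 1, device=fabric.device)
    zeros = torch.zeros(batch_size, 1, device=fabric.device)

    # Generator step: the backward pass goes through both wrapped modules
    # and reaches every parameter of each, so no unused parameters.
    fake = generator(z)
    g_loss = F.binary_cross_entropy_with_logits(discriminator(fake), ones)
    opt_g.zero_grad()
    fabric.backward(g_loss)
    opt_g.step()

    # Discriminator step: fake.detach() keeps the generator out of this
    # graph entirely, which is fine because the generator's DDP wrapper
    # was not forwarded in this step and expects no gradients.
    logits = discriminator(torch.cat([real, fake.detach()]))
    d_loss = F.binary_cross_entropy_with_logits(logits, torch.cat([ones, zeros]))
    opt_d.zero_grad()
    fabric.backward(d_loss)
    opt_d.step()
```

Note that gradients still flow into the discriminator during the generator step; they are simply discarded by `opt_d.zero_grad()` before the discriminator update, so correctness is preserved at the cost of one extra gradient sync.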