I was going through the training code in experiments/roma_indoor.py and it seems that you have used a non-distributed sampler (WeightedRandomSampler) instead of DistributedSampler. I believe this means that the entire dataset will be replicated and passed to each model replica instead of sharded across them. Just wanted to confirm this and ask if it is intentional?
Thanks!
Yes, all data goes to all replicas. It's not really intentional; a DistributedSampler should be used, but I think the effect would be minor. I guess if you're sending the entire ScanNet to each GPU it will be annoyingly heavy. Btw, I just used every 10th frame of ScanNet iirc (it was a while since I used the code).
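For reference, a minimal sketch of how the loader could be sharded with DistributedSampler (the dataset below is a dummy stand-in for the ScanNet dataset built in roma_indoor.py, and rank/world_size would normally come from the launched process group):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Dummy stand-in for the ScanNet dataset built in roma_indoor.py.
dataset = TensorDataset(torch.arange(1000))

# In real DDP training these come from torch.distributed
# (dist.get_rank() / dist.get_world_size()); hard-coded here for illustration.
rank, world_size = 0, 2

# DistributedSampler partitions the index space, so each GPU iterates over
# roughly len(dataset) / world_size samples per epoch instead of the full set.
sampler = DistributedSampler(dataset, num_replicas=world_size, rank=rank, shuffle=True)
loader = DataLoader(dataset, batch_size=8, sampler=sampler)

for epoch in range(2):
    # Reshuffle the shards each epoch; without set_epoch every epoch
    # uses the same permutation.
    sampler.set_epoch(epoch)
    for (batch,) in loader:
        pass  # training step would go here
```

Note that this drops the per-sample weighting that WeightedRandomSampler provides; PyTorch does not ship a combined weighted + distributed sampler, so keeping the weights would require a custom sampler or per-rank seeding as sketched further below.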
Thanks! I am not sure what happens when a non-distributed sampler is used with distributed training. I am assuming each replica of the dataloader gets a different seed, so the order of samples differs across devices?
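To illustrate the seeding question: WeightedRandomSampler knows nothing about ranks, so whether replicas draw the same or different samples depends entirely on each process's RNG state. A rough sketch of the two cases (the rank variable is hypothetical and would come from torch.distributed.get_rank() in real training):

```python
import torch
from torch.utils.data import WeightedRandomSampler

weights = torch.ones(1000)  # uniform weights, just for illustration
rank = 0  # hypothetical; would be torch.distributed.get_rank() in real training

# If every rank seeds its generator identically (common for reproducibility),
# each replica draws the *same* index sequence -- the data is fully replicated.
g_same = torch.Generator().manual_seed(0)
identical_draw = list(WeightedRandomSampler(weights, num_samples=8, generator=g_same))

# Offsetting the seed by rank makes the draws differ across GPUs, but ranks
# still sample independently with replacement from the full dataset, so
# overlap is possible and no strict sharding is guaranteed.
g_rank = torch.Generator().manual_seed(0 + rank)
per_rank_draw = list(WeightedRandomSampler(weights, num_samples=8, generator=g_rank))
```

If no generator is passed at all, the sampler falls back to each process's global torch RNG, so the answer depends on how the launcher seeds each worker process.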