Hi all,
I am trying to train a LERF model on a custom dataset of roughly 2,000 images. When I run the 'ns-train' command, my 32 GB of RAM fills up completely and the process is killed. I have previously trained standard NeRF models with Nerfstudio on the same dataset without any issues, so I am wondering whether 32 GB of RAM is simply insufficient for LERF training, or whether I am missing an option specific to LERF. Reducing the number of images in the dataset did not resolve the issue. The command I am using is:
ns-train lerf --output-dir --data data/colmap/
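In case it is relevant: I was also considering whether limiting how many images the datamanager holds in memory at once might help. A sketch of what I mean, assuming LERF's datamanager inherits the standard Nerfstudio sampling flags (the flag names come from Nerfstudio's VanillaDataManagerConfig, and the values 100 and 500 are placeholders, not tuned settings):
ns-train lerf --data data/colmap/ \
    --pipeline.datamanager.train-num-images-to-sample-from 100 \
    --pipeline.datamanager.train-num-times-to-repeat-images 500
If LERF does not support those flags, downscaling the input images would be my next guess, but I wanted to ask before going further.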
Thank you so much for your help.