Hello,
I want to use my own dataset, which consists of an unbounded, forward-facing scene, but I don't know how to choose the scale and offset in the transforms.json file (my values are in meters). I also saw that a depth map was used. Was this done with instant-ngp's depth supervision?
I tried to use my own depth map, but I think my value of integer_depth_scale is wrong. I just used max_value / 65535, but I suspect max_value also needs to be converted to the scene's units. I would be grateful for any feedback or ideas you may have.
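For anyone hitting the same question, here is a rough sketch of the unit bookkeeping involved. instant-ngp transforms camera positions by `x' = x * scale + offset`, so depth values stored in a 16-bit PNG must end up in those same scaled units, not in meters; that is presumably why `max_value / 65535` alone is off by a factor of `scale`. The helper names and the `target_radius` choice below are my own illustrative assumptions, not part of instant-ngp's codebase:

```python
import numpy as np

def ngp_scale_offset(cam_positions_m, target_radius=0.5):
    """Pick scale/offset so metric camera positions sit around the
    (0.5, 0.5, 0.5) center of instant-ngp's unit cube.

    Hypothetical helper: the target_radius of 0.5 is just one
    reasonable choice, not a value mandated by instant-ngp.
    """
    cams = np.asarray(cam_positions_m, dtype=np.float64)
    center = cams.mean(axis=0)
    radius = np.max(np.linalg.norm(cams - center, axis=1))
    scale = target_radius / max(radius, 1e-9)   # meters -> scene units
    offset = 0.5 - center * scale               # recenter at the cube center
    return scale, offset

def integer_depth_scale(max_depth_m, scale, bit_depth=16):
    """Factor that turns a stored integer depth back into scene units.

    Assumes the depth map was saved as linearly quantized metric depth:
    stored = round(depth_m / max_depth_m * (2**bit_depth - 1)).
    The extra `scale` factor converts meters into scaled scene units.
    """
    max_int = 2 ** bit_depth - 1
    return (max_depth_m / max_int) * scale

# Toy example with three cameras whose positions are in meters:
cams = [[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0]]
scale, offset = ngp_scale_offset(cams)
ids = integer_depth_scale(10.0, scale)  # for a depth map capped at 10 m
```

With `scale = 1` this reduces to the `max_value / 65535` formula from the question; the point is only that the same `scale` applied to the camera positions has to multiply the depth conversion too.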
If I'm not mistaken, instant-ngp is simply not capable of dealing accurately with infinite depths. This was also always apparent with the Lone Monk scene from the paper: the blue sky just floats a few meters above the courtyard. It looks good from the viewpoints in the dataset, but once you move around a bit further, you start seeing that it's not actually in the right place.