fix potential bug: when n_importance not divided by up_sample_steps w… #30

Open · wants to merge 1 commit into main

Conversation


@murumura murumura commented Jan 7, 2022

Potential bug:
In renderer.py, when self.n_importance is not divisible by self.up_sample_steps, the reshape at line 365 of renderer.py fails because the shapes differ.

For example, setting self.up_sample_steps=6, self.n_importance=64, and self.n_samples=64 leaves the tensor sdf with shape (batch_size, 124) instead of (batch_size, 128) after the last iteration of the up-sample loop, because 64 // 6 = 10 and the non-zero remainder is dropped at every loop iteration.

Meanwhile, line 341 computes n_samples = self.n_samples + self.n_importance, which gives 128 rather than 124, so
s_val = ret_fine['s_val'].reshape(batch_size, n_samples).mean(dim=-1, keepdim=True)
raises a reshape error (see the quick arithmetic check below).
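A standalone check of the sample-count arithmetic (the plain variable names below are stand-ins for the corresponding self.* attributes, not code from renderer.py):

```python
# Standalone check of the sample-count mismatch described above.
n_samples = 64        # self.n_samples
n_importance = 64     # self.n_importance
up_sample_steps = 6   # self.up_sample_steps

per_step = n_importance // up_sample_steps        # 64 // 6 = 10; the remainder 4 is lost
actual = n_samples + up_sample_steps * per_step   # 64 + 6 * 10 = 124
expected = n_samples + n_importance               # 64 + 64     = 128

print(actual, expected)  # 124 128 -> reshape(batch_size, 128) fails on a (batch_size, 124) tensor
```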

To resolve this, I simply add the accumulated non-zero remainder to the n_importance argument passed to self.up_sample on the last iteration, as sketched below.
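A minimal sketch of the idea (the helper split_importance is hypothetical and only illustrates the counting; in the actual patch the remainder is added to the n_importance argument passed to self.up_sample on the final loop iteration):

```python
# Sketch of the fix: distribute self.n_importance over the up-sample steps so that
# the final step also receives the accumulated remainder.
def split_importance(n_importance, up_sample_steps):
    counts = [n_importance // up_sample_steps] * up_sample_steps
    counts[-1] += n_importance % up_sample_steps  # last step absorbs the remainder
    return counts

counts = split_importance(64, 6)
print(counts)            # [10, 10, 10, 10, 10, 14]
print(64 + sum(counts))  # 128 -> matches n_samples = self.n_samples + self.n_importance
```

With this split, the sdf tensor after the up-sample loop has shape (batch_size, 128), so the reshape at line 365 succeeds.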
