I think that in `vanilla_vae.py`'s `loss_function` there's a mistake in the returned KLD value:

`return {'loss': loss, 'Reconstruction_Loss': recons_loss.detach(), 'KLD': -kld_loss.detach()}`

The negative sign (`-`) should not be there!
Agree. The negative sign is already applied when `kld_loss` is computed, so it should not be applied again here.
Agree. Line #143 already contains one negative sign (the `-0.5` factor inside the `kld_loss` computation).
This also happens in `mssim_vae.py` at line #155:

```python
kld_loss = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1), dim=0)
loss = recons_loss + kld_weight * kld_loss
return {'loss': loss, 'Reconstruction_Loss': recons_loss, 'KLD': -kld_loss}
```
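For reference, here is a minimal sketch of the fix being proposed (the function signature is hypothetical; only the KLD formula and the return dict follow the snippets above). The `-0.5` factor already makes `kld_loss` the (non-negative) KL divergence between `N(mu, sigma^2)` and `N(0, 1)`, so the reported `'KLD'` should not be negated a second time:

```python
import torch

def loss_function(recons_loss, mu, log_var, kld_weight):
    # KL divergence between N(mu, sigma^2) and the standard normal prior.
    # The leading -0.5 already gives a non-negative value, so no extra
    # sign flip is needed anywhere below.
    kld_loss = torch.mean(
        -0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1),
        dim=0)
    loss = recons_loss + kld_weight * kld_loss
    # Report kld_loss as-is; the '-' in the original return statement
    # flipped the logged sign a second time.
    return {'loss': loss,
            'Reconstruction_Loss': recons_loss.detach(),
            'KLD': kld_loss.detach()}
```

With `mu = 1`, `log_var = 0`, each latent dimension contributes `-0.5 * (1 + 0 - 1 - 1) = 0.5` to the KLD, so the logged value comes out positive as expected.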