Maximum-a-posterior estimate given a MultipleIndependent prior and MCMC sampler #841
Thanks for reporting, this indeed sounds like a bug. As a quick workaround, you could turn off the transformation by passing `theta_transform=None` when constructing the `MCMCPosterior`:

```python
from sbi.inference import likelihood_estimator_based_potential, MCMCPosterior

likelihood_estimator = inference.append_simulations(theta, x).train()
potential_fn, parameter_transform = likelihood_estimator_based_potential(
    likelihood_estimator, prior, x_o
)
posterior = MCMCPosterior(
    potential_fn, proposal=prior, theta_transform=None  # instead of `parameter_transform`
)
map_estimate = posterior.set_default_x(x_o).map()
```
Thank you for your response. As you suggested, I set …
Ah damn. Can you try setting …?
Yes, it did work (consistently with up to …).
The MAP is found by performing gradient ascent on …. You could also try …. Anyways, we will have to fix this (it's def a bug) but I won't get to it today.
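The failure mode discussed in this thread can be illustrated with a dependency-free toy example (plain Python; this is a sketch, not sbi's actual `map()` implementation): naive gradient ascent on a log-density with support θ > 0 can overshoot past zero when the step size is large, whereas ascending in the unconstrained space u = log θ can never leave the support — which is exactly what a parameter transform is for.

```python
import math

SCALE = 2.0  # scale of the toy HalfNormal-shaped density

def log_density(theta):
    """Toy log-density with support theta > 0 (up to an additive constant)."""
    if theta <= 0:
        return float("-inf")  # outside the support
    return -theta**2 / (2.0 * SCALE**2)

def grad_log_density(theta):
    return -theta / SCALE**2

# 1) Naive gradient ascent in theta-space: with a large step size, a single
#    update overshoots past zero and leaves the support.
theta, step = 0.5, 8.0
theta = theta + step * grad_log_density(theta)   # 0.5 + 8 * (-0.125) = -0.5
print(theta, log_density(theta))                 # negative theta, -inf density

# 2) Ascent in the unconstrained space u = log(theta): theta = exp(u) is
#    positive for every u, so the iterate can never leave the support.
u = math.log(0.5)
for _ in range(50):
    theta_u = math.exp(u)
    # chain rule: d/du log p(exp(u)) = (log p)'(exp(u)) * exp(u)
    u = u + 0.5 * grad_log_density(theta_u) * theta_u
print(math.exp(u))  # still > 0, moving toward the boundary at 0
```

The second loop is the idea behind `parameter_transform`: optimize in an unconstrained space and map back, so the MAP search cannot propose values with zero prior density.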
Hello,

I am encountering an error when trying to estimate the maximum-a-posteriori given an `MCMCPosterior` (sampling method `slice_np_vectorized`) and a `MultipleIndependent` prior, created via `process_prior` from a list of PyTorch distributions like `torch.distributions.HalfNormal(scale=torch.ones(1)*2)`, after a single round. In doing so, it happens that the optimization occasionally selects values beyond the distribution's support. The trace of the error: …

I think that this error is related to #650 (this time with MCMC). However, I'm unable to grasp the details discussed there, which makes it difficult for me to pinpoint the problem in my code.
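For context, a `MultipleIndependent`-style prior treats each parameter dimension as an independent 1-D distribution, so the joint log-probability is the sum of the per-dimension log-probabilities. A minimal pure-Python sketch of that idea (not sbi's implementation; the half-normal log-density is written out by hand):

```python
import math

def halfnormal_logprob(x, scale=2.0):
    """log-density of a half-normal with the given scale, support x >= 0:
    p(x) = sqrt(2/pi)/scale * exp(-x^2 / (2*scale^2))."""
    if x < 0:
        return float("-inf")  # outside the support
    return 0.5 * math.log(2.0 / math.pi) - math.log(scale) - x**2 / (2.0 * scale**2)

def joint_logprob(theta, logprob_fns):
    """Joint log-prob of independent 1-D priors: the sum over dimensions,
    which is what a MultipleIndependent-style prior computes."""
    return sum(fn(t) for fn, t in zip(logprob_fns, theta))

fns = [halfnormal_logprob, halfnormal_logprob]
print(joint_logprob([1.0, 0.5], fns))
print(joint_logprob([1.0, -0.1], fns))  # one dimension outside support -> -inf
```

A single out-of-support dimension makes the joint log-probability `-inf`, which is why a MAP search that steps outside any one distribution's support breaks down.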