LKJ follow-up #134
Additionally we might want to:
This has now been done in #246 👍
Worth pointing out that the comments on numerical issues in TuringLang/DynamicPPL.jl#485 are talking about ForwardDiff, so I'd be curious to see whether usage of the
The issue with ForwardDiff there was more about correctness, as I was not expecting a `Dual` with a single partial when differentiating with respect to LKJ samples. We get numerical issues without ForwardDiff as well, which probably have to do with the numerical stability of the inverse link (`Bijectors.jl/src/bijectors/corr.jl`, lines 443 to 464 at `03bdffb`).
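For context, the inverse link being discussed maps unconstrained strictly-upper-triangular entries through `tanh` to canonical partial correlations and rebuilds an upper-triangular Cholesky factor of a correlation matrix. Below is a minimal pure-Python sketch of that construction (my own simplified reconstruction for illustration, not the actual `corr.jl` code); it shows where stability breaks down: once `tanh` saturates to exactly `±1.0` in Float64, the `1 - z^2` factors underflow to zero.

```python
import math

def cpc_inverse_link(y):
    """Sketch of a CPC-style inverse link (simplified reconstruction,
    not the Bijectors.jl implementation): map an unconstrained
    strictly-upper-triangular matrix y to an upper-triangular Cholesky
    factor W of a correlation matrix via z = tanh(y)."""
    n = len(y)
    z = [[math.tanh(y[i][j]) for j in range(n)] for i in range(n)]
    w = [[0.0] * n for _ in range(n)]
    w[0][0] = 1.0
    for j in range(1, n):
        w[0][j] = z[0][j]
        rem = math.sqrt(1.0 - z[0][j] ** 2)  # remaining "mass" in column j
        for i in range(1, j):
            w[i][j] = z[i][j] * rem
            rem *= math.sqrt(1.0 - z[i][j] ** 2)
        w[j][j] = rem  # telescoping product keeps the column norm at 1
    return w

# Moderate inputs: each column of W has unit norm, so W'W is a
# valid correlation matrix.
y = [[0.0, 0.3, -1.2],
     [0.0, 0.0, 0.7],
     [0.0, 0.0, 0.0]]
w = cpc_inverse_link(y)
col_norms = [sum(w[i][j] ** 2 for i in range(3)) for j in range(3)]

# The stability cliff: tanh saturates to exactly 1.0 in IEEE double
# well before the input is large, and 1 - z^2 then underflows to 0.
z_sat = math.tanh(20.0)
```

The saturation point is where samplers that push the unconstrained parameters far from zero (as NUTS can during warmup) start producing degenerate Cholesky factors.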
Numerical issues directly related to ForwardDiff were observed in #253 (comment), which we treated by explicitly wrapping the matrix with
Just adding this here as well, to move the discussion from the DynamicPPL PR. The numerical issues when sampling do seem to be ForwardDiff-related. See TuringLang/DynamicPPL.jl#485 (comment) for a simple test case, which gives loads of numerical issues when sampling with NUTS and `Dual`s are passed through, but seems to work fine with ReverseDiff.
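To make the ForwardDiff side of this concrete, here is a toy forward-mode dual number in Python (a sketch for illustration only, not ForwardDiff.jl's actual `Dual` type): a primal value carrying a single partial, which is exactly the kind of input the earlier comment describes being passed through the bijector.

```python
import math

class Dual:
    """Toy forward-mode dual number: value plus a single partial."""
    def __init__(self, val, eps=0.0):
        self.val = val  # primal value
        self.eps = eps  # single partial (derivative seed)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule propagates the partial
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)

    __rmul__ = __mul__

def dtanh(x):
    """tanh lifted to Dual via the chain rule."""
    t = math.tanh(x.val)
    return Dual(t, (1.0 - t * t) * x.eps)

# Differentiate f(x) = tanh(x) * x at x = 0.5 by seeding eps = 1:
x = Dual(0.5, 1.0)
y = dtanh(x) * x  # y.val = f(0.5), y.eps = f'(0.5)
```

Any code the `Dual` flows through must handle this wrapped type correctly; the correctness issue mentioned above arises when a function downstream of the bijector does not expect such an input.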
Some things could probably be improved as a follow-up to the initial implementation (though they might require changes in other packages, such as AD backends):