
Better gradient interface #97

Closed · yebai opened this issue Mar 8, 2017 · 3 comments
yebai (Member) commented Mar 8, 2017

It would be nice to have a user-friendly interface for automatic differentiation, e.g.

@model gdemo(x) = begin
  s ~ InverseGamma(2, 3)
  m ~ Normal(0, sqrt(s))
  for i = 1:length(x)
    x[i] ~ Normal(m, sqrt(s))
  end
  return s, m, x
end

g = @gradient(gdemo([2.0, 3.0]), varInfo = nothing)

By default, the user does not need to pass in varInfo (i.e. the gradient is taken with respect to all parameters).
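For illustration, here is a minimal sketch of the gradient computation such an interface would wrap, assuming ForwardDiff.jl and Distributions.jl, with the log joint density of gdemo written out by hand in an unconstrained parameterisation (this is not Turing's actual API; logp and ∇logp are hypothetical names):

```julia
using ForwardDiff, Distributions

x = [2.0, 3.0]

# Unconstrained parameterisation: θ = (log s, m), so s > 0 is guaranteed.
function logp(θ)
    s, m = exp(θ[1]), θ[2]
    logpdf(InverseGamma(2, 3), s) + θ[1] +      # prior on s, plus log-Jacobian of exp
        logpdf(Normal(0, sqrt(s)), m) +         # prior on m
        sum(logpdf.(Normal(m, sqrt(s)), x))     # likelihood
end

# Gradient of the log joint with respect to all parameters.
∇logp(θ) = ForwardDiff.gradient(logp, θ)

∇logp([0.0, 1.0])  # evaluate at s = 1, m = 1
```

The point of the proposed @gradient macro is that the user would get ∇logp directly from the @model definition, without having to write logp by hand.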

This would be very helpful for users who want to contribute new gradient-based inference methods.

yebai mentioned this issue Mar 8, 2017
yebai modified the milestone: Release 0.3 Mar 8, 2017
xukai92 (Member) commented Apr 11, 2017

I think this interface is effectively provided by the closure-based way of creating a model. As in the AD test file, we can now use

∇E = gradient(vi, ad_test_f)

to get the gradient dictionary.

I guess the user has to pass a VarInfo to it; otherwise we don't know at which point to evaluate the derivative?

yebai (Member, Author) commented Apr 11, 2017

That's true. We can probably close this issue now.

xukai92 (Member) commented Apr 11, 2017

Cool.

xukai92 closed this as completed Apr 11, 2017