Adds 3 basic implementations of approximations for gp.Latent #3
Conversation
At this point the ProjectedProcess / DTC and KarhunenLoeve implementations are feature complete, I think (so not counting tests and docstrings). The HSGP one is trickier, since it requires the spectral density of the kernel. I'm not sure whether the API should take string arguments (i.e. "matern32") for kernels with spectral densities, or whether to expect and check for specific covariance inputs (i.e.
Also, I'm not sure whether there should be a factory function similar to
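One possible shape for the second option (dispatching on the covariance type rather than on a string) is sketched below. The kernel classes and the `spectral_density` helper are hypothetical stand-ins, not the actual PyMC API; the formulas are the standard 1D spectral densities of the unit-variance ExpQuad and Matern 3/2 kernels.

```python
import numpy as np

# Hypothetical stand-ins for kernel objects; not the actual PyMC classes.
class ExpQuad:
    def __init__(self, ls):
        self.ls = ls

class Matern32:
    def __init__(self, ls):
        self.ls = ls

def spectral_density(cov, omega):
    """Dispatch on the covariance type (1D, unit variance).

    Raises for kernels without a known spectral density, which is the
    up-front check an HSGP constructor could perform.
    """
    if isinstance(cov, ExpQuad):
        # S(w) = sqrt(2*pi) * ls * exp(-0.5 * ls^2 * w^2)
        return np.sqrt(2.0 * np.pi) * cov.ls * np.exp(-0.5 * cov.ls**2 * omega**2)
    if isinstance(cov, Matern32):
        # S(w) = 12*sqrt(3)/ls^3 * (3/ls^2 + w^2)^(-2)
        return 12.0 * np.sqrt(3.0) / cov.ls**3 * (3.0 / cov.ls**2 + omega**2) ** -2
    raise NotImplementedError(f"No spectral density known for {type(cov).__name__}")
```

The string-argument alternative would amount to a dict mapping names like "matern32" to the same functions; dispatching on the object has the advantage that hyperparameters such as the lengthscale travel with the kernel.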
```python
class ProjectedProcess(pm.gp.Latent):
    ## AKA: DTC
    def __init__(self, n_inducing, *, mean_func=pm.gp.mean.Zero(), cov_func=pm.gp.cov.Constant(0.0)):
```
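For reference, the core of the DTC / projected-process idea in the quoted class is the Nyström-style low-rank approximation K ≈ K_xu K_uu⁻¹ K_ux over a set of inducing points. A minimal NumPy sketch follows; the kernel, grid, and inducing-point choices are illustrative, not taken from the PR:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Unit-variance squared-exponential kernel matrix between 1D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

x = np.linspace(0.0, 10.0, 100)   # input points
xu = np.linspace(0.0, 10.0, 30)   # inducing points (n_inducing = 30)

Kuu = rbf(xu, xu) + 1e-8 * np.eye(len(xu))  # jitter for numerical stability
Kxu = rbf(x, xu)

# DTC / projected-process covariance: a rank-30 approximation of K_xx
K_dtc = Kxu @ np.linalg.solve(Kuu, Kxu.T)

max_err = np.max(np.abs(K_dtc - rbf(x, x)))
```

With inducing points spaced well below the kernel lengthscale, the low-rank matrix tracks the exact covariance closely; nothing here assumes stationarity, which is why this approximation (unlike HSGP) can take arbitrary covariance functions.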
I would add type hints to the public-facing API and at least simple docstrings. This can be done in a separate PR if you prefer.
@bwengals can you link to papers for the three approximations? Does one of them correspond to the method from https://arxiv.org/pdf/2002.09309.pdf ? I'm asking because I want to refer to these methods & implementations in my thesis...
Also, can we merge this?
Nice!
There's a notebook with a basic 1D demo of each.
The Karhunen-Loeve expansion GP and the DTC approximation can both take arbitrary covariance functions, while HSGP only works for stationary kernels with a defined spectral density. What is the best way to interface this with the way the kernel library is designed?
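To make the stationarity constraint concrete, here is a minimal sketch of the Hilbert-space (HSGP) approximation in the style of Solin & Särkkä: Laplacian eigenfunctions on a boundary domain [-L, L], weighted by the kernel's spectral density evaluated at the square roots of the eigenvalues. The names and parameter choices are illustrative, not the PR's API; the ExpQuad spectral density is used as the example.

```python
import numpy as np

ls, L, m = 1.0, 5.0, 50          # lengthscale, half-domain size, number of basis functions
x = np.linspace(-2.0, 2.0, 40)   # inputs kept well inside the boundary

# Dirichlet eigenfunctions / eigenvalues of the Laplacian on [-L, L]
j = np.arange(1, m + 1)
sqrt_lam = j * np.pi / (2.0 * L)
phi = np.sqrt(1.0 / L) * np.sin(sqrt_lam[None, :] * (x[:, None] + L))

# Spectral density of the unit-variance ExpQuad kernel (1D)
S = np.sqrt(2.0 * np.pi) * ls * np.exp(-0.5 * ls**2 * sqrt_lam**2)

# k(x, x') ~= sum_j S(sqrt(lam_j)) * phi_j(x) * phi_j(x')
K_hsgp = (phi * S[None, :]) @ phi.T

K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ls**2)
max_err = np.max(np.abs(K_hsgp - K_exact))
```

A Matern kernel works the same way with its own spectral density plugged into `S`, which is exactly the constraint under discussion: a kernel without a spectral density has nothing to substitute here, so HSGP cannot accept arbitrary covariance functions the way the other two approximations can.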