This issue is based on a conversation I had with @ricardoV94 about whether it would be possible to add an integration Op. Based on that conversation, a Quad Op with gradients should be possible to implement in PyTensor and JAX by directly wrapping scipy.integrate.quad (disclaimer: I am still not clear on the nomenclature around vjp, jvp, push forward, pull back, gradient, etc., but everything needed seems to be in this thread).
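To sketch what I mean (untested, and all the names here — `QuadOp`, `func`, `grad_func` — are made up for illustration): the gradient falls out of the Leibniz rule. For fixed limits, `d/dtheta ∫_a^b f(x, theta) dx = ∫_a^b df/dtheta(x, theta) dx`, so the gradient of a quad Op is just another quad Op applied to the derivative of the integrand:

```python
import numpy as np
from scipy.integrate import quad

import pytensor
import pytensor.tensor as pt
from pytensor.gradient import grad_not_implemented
from pytensor.graph.basic import Apply
from pytensor.graph.op import Op


class QuadOp(Op):
    """Integrate func(x, theta) from a to b via scipy.integrate.quad."""

    __props__ = ("func", "grad_func")

    def __init__(self, func, grad_func=None):
        self.func = func            # f(x, theta) -> float
        self.grad_func = grad_func  # df/dtheta(x, theta) -> float

    def make_node(self, a, b, theta):
        a, b, theta = map(pt.as_tensor_variable, (a, b, theta))
        return Apply(self, [a, b, theta], [pt.dscalar()])

    def perform(self, node, inputs, output_storage):
        a, b, theta = inputs
        val, _abserr = quad(self.func, float(a), float(b), args=(float(theta),))
        output_storage[0][0] = np.asarray(val)

    def grad(self, inputs, output_grads):
        a, b, theta = inputs
        (g_out,) = output_grads
        # Leibniz rule: the theta-gradient of the integral is another
        # integral, of df/dtheta, over the same limits (no second-order
        # gradient here, since the inner Op gets grad_func=None).
        dtheta = QuadOp(self.grad_func)(a, b, theta)
        return [
            grad_not_implemented(self, 0, a),  # would be -f(a, theta)
            grad_not_implemented(self, 1, b),  # would be +f(b, theta)
            g_out * dtheta,
        ]


# Usage: integral of exp(-theta * x**2) over [0, 1], differentiable in theta.
theta = pt.dscalar("theta")
integral = QuadOp(
    func=lambda x, th: np.exp(-th * x**2),
    grad_func=lambda x, th: -(x**2) * np.exp(-th * x**2),
)(0.0, 1.0, theta)
d_dtheta = pytensor.grad(integral, theta)
```

On the JAX side, the same structure should map onto `jax.pure_callback` for the QUADPACK call plus `jax.custom_jvp` for the Leibniz-rule tangent. Something like this (again untested, fixed limits only, and `make_quad` is just a name I picked):

```python
import jax
import numpy as np
from scipy.integrate import quad

jax.config.update("jax_enable_x64", True)


def make_quad(func, grad_func, a, b):
    """Integral of func(x, theta) over fixed [a, b], differentiable in theta."""

    def via_quadpack(f, theta):
        # pure_callback lets the traced program call back into SciPy.
        def _quad(theta):
            val, _abserr = quad(f, a, b, args=(float(theta),))
            return np.float64(val)

        return jax.pure_callback(_quad, jax.ShapeDtypeStruct((), np.float64), theta)

    @jax.custom_jvp
    def integral(theta):
        return via_quadpack(func, theta)

    @integral.defjvp
    def integral_jvp(primals, tangents):
        (theta,), (dtheta,) = primals, tangents
        # Leibniz rule with fixed limits:
        #   d/dtheta ∫ f(x, theta) dx = ∫ df/dtheta(x, theta) dx
        return integral(theta), via_quadpack(grad_func, theta) * dtheta

    return integral
```

Since the tangent is linear in `dtheta`, reverse mode (`jax.grad`) should come for free from the JVP rule.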
Numba will be trickier, as usual, because of Numba's spotty coverage of SciPy. SciPy uses QUADPACK, written in Fortran, to do the actual computation. I'm pretty sure this can be overloaded, but it would take a bit of tinkering.
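One way around the Fortran problem, at least as a stopgap, might be Numba's `objmode` escape hatch: drop back to the interpreter just for the QUADPACK call and keep everything around it compiled. Rough sketch (untested; the integrand is a placeholder):

```python
import numpy as np
from numba import njit, objmode
from scipy.integrate import quad


def integrand(x, theta):
    # Placeholder integrand; in a real Op this would be the user's function.
    return np.exp(-theta * x**2)


@njit
def quad_objmode(a, b, theta):
    # Fall back to object mode only for the QUADPACK call; the rest of
    # the surrounding compiled code stays in nopython mode.
    with objmode(val="float64"):
        val = quad(integrand, a, b, args=(theta,))[0]
    return val
```

That pays a Python-call overhead on every evaluation, so a proper overload would still be nicer, but it would make the Op usable from the Numba backend without touching QUADPACK.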