If we have `f: R^n -> R^m`, then `Jac(f)` is an `m x n` matrix and `Hess(f)` is a tensor of order 3 (`m x n x n`). Currently, Hessian matrices are stored as 2D matrices, which assumes that we deal with scalar-valued functions (`m = 1`). Note that Hessian matrices are symmetric, so we could maybe use Eigen's self-adjoint view.
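For illustration, here is a minimal sketch of the scalar case (`m = 1`) using Eigen's self-adjoint view; the function and values are made up for the example:

```cpp
#include <Eigen/Dense>

// Minimal sketch (illustrative only): for a scalar-valued f : R^n -> R,
// the Hessian is an n x n symmetric matrix, so only one triangle needs
// to be filled; Eigen's selfadjointView can mirror the other half.
int main ()
{
  const int n = 3;
  Eigen::MatrixXd hessian = Eigen::MatrixXd::Zero (n, n);

  // Fill the lower triangle only, e.g. for f(x) = x0*x1 + x2^2.
  hessian (1, 0) = 1.; // d2f / dx1 dx0
  hessian (2, 2) = 2.; // d2f / dx2^2

  // View the matrix as symmetric, mirroring the lower triangle.
  Eigen::MatrixXd full = hessian.selfadjointView<Eigen::Lower> ();

  // 'full' now holds the complete symmetric Hessian.
  return 0;
}
```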
Actually, if you use the same strategy as `impl_gradient`, then `impl_hessian` can be represented with matrices. I.e. you consider that non-scalar functions are the concatenation of m scalar functions, and you choose the one in the list using an index (cf. the `impl_gradient` prototype).
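For illustration, a hypothetical sketch of what such an indexed `impl_hessian` could look like; the class and type names below are invented for the example and do not claim to match the actual `impl_gradient` prototype:

```cpp
#include <Eigen/Dense>

// Hypothetical sketch of the proposed interface: a vector-valued
// f : R^n -> R^m is treated as m concatenated scalar functions, and
// impl_hessian fills the n x n Hessian of the component selected by
// functionId, mirroring the idea behind the impl_gradient prototype.
class TwiceDifferentiableFunction
{
public:
  typedef Eigen::MatrixXd hessian_t;
  typedef Eigen::VectorXd argument_t;

  virtual ~TwiceDifferentiableFunction () {}

  // Compute the Hessian of the functionId-th scalar component at x.
  virtual void impl_hessian (hessian_t& hessian,
                             const argument_t& x,
                             unsigned functionId = 0) const = 0;
};

// Example: f : R^2 -> R^2, f(x) = (x0 * x1, x0^2).
class Example : public TwiceDifferentiableFunction
{
public:
  void impl_hessian (hessian_t& hessian,
                     const argument_t& /* x */,
                     unsigned functionId = 0) const
  {
    hessian.setZero (2, 2);
    if (functionId == 0) // Hessian of x0 * x1
      hessian (0, 1) = hessian (1, 0) = 1.;
    else                 // Hessian of x0^2
      hessian (0, 0) = 2.;
  }
};
```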
I would go for this first. One reason is that it keeps the interface consistent and uses matrices only.
Yes, I was thinking of using a better matrix type, but so far I haven't implemented this.
Another alternative would be to use the Eigen unsupported module for tensor representation. Right now, I don't know enough about this module to really want to rely on it, but it is worth mentioning anyway.
I've never tried Eigen's unsupported tensors either. I guess they're still doing lots of work on it, so the API will probably change quite a lot in the near future. And apparently they're adding support for symmetries.
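For the record, a minimal sketch of what the order-3 Hessian could look like with the unsupported `Eigen::Tensor` module; the dimensions and indexing convention here are just one possible choice:

```cpp
#include <unsupported/Eigen/CXX11/Tensor>

int main ()
{
  const int m = 2, n = 3;

  // Order-3 tensor holding the m Hessians of f : R^n -> R^m,
  // indexed as hessians(i, j, k) = d2 f_i / dx_j dx_k.
  Eigen::Tensor<double, 3> hessians (m, n, n);
  hessians.setZero ();

  // Fill one entry of the Hessian of the first component.
  hessians (0, 1, 2) = 1.;

  // Extract the Hessian of component 0 as a 2D slice:
  // chip along dimension 0 at offset 0.
  Eigen::Tensor<double, 2> h0 = hessians.chip (0, 0);
  return 0;
}
```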
I guess we can indeed start by concatenating everything into a 2D matrix, and leave any optimization for later if someone really uses that feature.