
Improve linear algebra functions #14962

Open
arcadiaphy opened this issue May 15, 2019 · 7 comments

Comments

@arcadiaphy
Member

arcadiaphy commented May 15, 2019

One of my projects needs matrix inversion, which is neither very accurate nor convenient to do with Cholesky factorization, so I decided to implement it myself and improve the linalg package at the same time. New operators to add (NumPy equivalents are sketched after the table):

| operator | function | progress |
| --- | --- | --- |
| inverse | matrix inversion | #14963 |
| det | matrix determinant | #15007 |
| slogdet | signed log determinant | #15007 |
| svd | singular value decomposition | |
| pinverse | pseudo-inverse (Moore-Penrose inverse) | |
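For reference, a rough sketch of what each proposed operator computes, expressed through the NumPy equivalents (the MXNet-side names above are the proposal's, not an existing API):

```python
import numpy as np

# Small symmetric positive-definite test matrix.
A = np.random.rand(3, 3)
A = A @ A.T + 3.0 * np.eye(3)

inv = np.linalg.inv(A)                  # proposed `inverse`
det = np.linalg.det(A)                  # proposed `det`
sign, logabsdet = np.linalg.slogdet(A)  # proposed `slogdet`
U, s, Vt = np.linalg.svd(A)             # proposed `svd`
pinv = np.linalg.pinv(A)                # proposed `pinverse`
```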

PyTorch uses MAGMA in its implementation; I'll try to use only cuBLAS and cuSolver so as not to add dependencies.

@mxnet-label-bot
Contributor

Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so that the appropriate MXNet community members can help resolve it.
Here are my recommended labels: Feature

@vdantu
Contributor

vdantu commented May 15, 2019

@mxnet-label-bot add [Feature Request]

@zboldyga
Contributor

@mseeger @asmushetzel I noticed you both had insightful comments on a related PR: #15007

One thing I haven't seen validated yet on this issue (#14962): do MXNet contributors feel SVD and the Moore-Penrose inverse should be exposed via MXNet?

At a glance, libraries like NumPy, SciPy, and TensorFlow all expose these. From an adoptability standpoint alone, it seems like a good idea to support these ops.

@mseeger
Contributor

mseeger commented Jul 19, 2019

Hello, in fact we are starting work on the SVD; it will come soon. With that, you can derive the others.
BTW: what are the best use cases you know for SVD in a NN library? Just curious. One reason we did not work on this sooner is that we lacked the use cases.

@mseeger
Contributor

mseeger commented Jul 19, 2019

IMHO, one should not implement all sorts of derived computations as operators (inverse, logdet, det, ...), just as we implement conv2d as an op but a resnet-cell as a HybridBlock. It is seriously inefficient to recompute decompositions several times just because you don't see what is really happening inside.
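To make this concrete, a minimal sketch of the block-level alternative, assuming the existing linalg ops potrf, potri, and sumlogdiag: the decomposition is computed once and both derived quantities reuse the factor.

```python
import mxnet as mx
from mxnet.gluon import HybridBlock

class CholInvLogdet(HybridBlock):
    """Factorize once; derive both the inverse and the log-determinant
    of a symmetric positive-definite input from the same Cholesky factor."""
    def hybrid_forward(self, F, A):
        L = F.linalg.potrf(A)                # single decomposition: A = L L^T
        A_inv = F.linalg.potri(L)            # inverse of A from the factor L
        logdet = 2 * F.linalg.sumlogdiag(L)  # log|A| = 2 * sum(log(diag(L)))
        return A_inv, logdet
```

An operator-per-quantity design (inverse as one op, logdet as another) would run potrf twice for the same pair of outputs.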

@zboldyga
Contributor

@mseeger I agree with your sentiment overall. Users shouldn't wander into using these multi-step ops without a warning, and the likelihood that these multi-step ops are actually needed is tiny. Personally I don't have a use for them; I'm just following up on loose ends from other user-reported issues in #14360.

I think the strongest argument I've seen is that every other numerical computation library supports these functions, so for the sake of porting code across libraries alone it would make sense to have equivalents in MXNet. If there is no one-to-one translation, or at least guidance on how to implement the op in MXNet, anyone without some fluency in linear algebra may get stuck.

That said, I'm not necessarily arguing that these multi-step ops need to be implemented. Even if there is just a 1-2 sentence blurb in the documentation describing why the op isn't supported and how to use the lower-level ops to get the desired result, I think that would satisfy anyone porting code.

@zboldyga
Contributor

Regarding svd and pinv: if you implement svd, a short stub in the documentation describing why pinv isn't provided and how to use svd to get pinv might be useful. I can add that if you like.
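For example, such a stub might show something like the following sketch; NumPy is used here since the MXNet svd op is still in progress, and `rcond` follows NumPy's pinv convention:

```python
import numpy as np

def pinv_via_svd(A, rcond=1e-15):
    # Moore-Penrose inverse from a thin SVD:
    # A = U diag(s) V^T  =>  A+ = V diag(s+) U^T
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    cutoff = rcond * s.max()
    # Invert only the well-conditioned singular values; zero the rest.
    s_inv = np.where(s > cutoff, 1.0 / s, 0.0)
    return Vt.T @ (s_inv[:, None] * U.T)
```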

This same paradigm could also be used for the more complex ops supported by other frameworks. I'm not really taking a strong stand on this one, just suggesting this as another option!
