
When to determine rank in dense pivoted QR? #514

Open
andreasnoack opened this issue Mar 23, 2018 · 2 comments
Labels
needs decision A decision on this change is needed

Comments

@andreasnoack
Member

This continues the discussion in JuliaLang/julia#26392. Currently, the rank is determined at different points in sparse and dense pivoted QR. For our LAPACK-based dense pivoted QR, the rank is not determined as part of the factorization but as part of the least-squares solver. In contrast, the SuiteSparse-based sparse pivoted QR determines the rank as part of the factorization. As a consequence, qrfact(SparseMatrixCSC) takes a (currently undocumented, see https://github.com/JuliaLang/julia/issues/26592) tol argument, whereas in the dense case the tolerance is specified in ldiv! (also undocumented). Hence, the size of F.R depends on the rank in the sparse case but not in the dense case, where ldiv! does the truncation of R.
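For concreteness, a minimal sketch of the asymmetry, written against the qrfact/Val(true) API used in this thread; the exact spelling of the sparse tol keyword and its default are assumptions based on the description above, not verified here:

```julia
using LinearAlgebra, SparseArrays

A = [1.0 2.0 3.0;
     2.0 4.0 6.0;     # second row is a multiple of the first, so rank(A) == 2
     1.0 1.0 1.0]
b = [1.0, 2.0, 3.0]

# Sparse path: SPQR fixes the rank while factorizing, controlled by `tol`,
# so the number of rows of Fs.R already reflects the detected rank.
Fs = qrfact(sparse(A), tol = 1e-12)

# Dense path: LAPACK only pivots; no rank is fixed during the factorization,
# and Fd.R stays full size.  The truncation happens later, inside the solve.
Fd = qrfact(A, Val(true))
x  = Fd \ b            # ldiv! applies its own (undocumented) cutoff here
```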

The question is whether we should move the rank determination to the factorization step in the dense case as well. This would also be consistent with pivoted Cholesky (i.e., LAPACK itself is inconsistent here). In practice, it would require a minor, non-disruptive modification of the QRPivoted structure, but it would probably also simplify ldiv! a bit.
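A rough sketch of what determining the rank at factorization time could look like on the dense side; the helper name, the tol default, and the idea of returning the rank alongside the factorization are assumptions of the illustration, not the proposal's actual code. Column pivoting makes the magnitudes of diag(R) non-increasing, so the numerical rank can be read off the diagonal as soon as xGEQP3 returns:

```julia
using LinearAlgebra

# Hypothetical helper: compute the numerical rank directly from the pivoted
# factorization instead of deferring the decision to ldiv!.
function pivoted_qr_rank(A::AbstractMatrix;
                         tol = eps(float(real(one(eltype(A))))) * maximum(size(A)))
    F = qrfact(A, Val(true))          # LAPACK xGEQP3 under the hood
    d = abs.(diag(F.R))               # non-increasing by construction
    r = isempty(d) ? 0 : count(d .> tol * d[1])  # relative cutoff vs. largest pivot
    return F, r
end

F, r = pivoted_qr_rank([1.0 2.0; 2.0 4.0])   # r == 1 for this rank-deficient matrix
```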

@KlausC
Contributor

KlausC commented Mar 23, 2018

As mentioned in #26410, there is some overlap here.
In the interest of a uniform API for the sparse and dense cases, it would be good to adapt the dense case to the sparse one.
In the SPQR algorithm, tol is needed during the factorization and cannot be avoided. In the LAPACK algorithm with column reordering, it is possible to postpone the evaluation of rank and singularity until after the factorization. I don't think, though, that ldiv! is the correct place to do that, because

  1. rank and singularity considerations are independent of the right-hand side of the solve, and
  2. the typical idiom of applying the factorization to a right-hand side, qrf \ rhs, does not allow specifying a third parameter (see the sketch after this list).
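To make point 2 concrete, a short sketch of the calling patterns involved; the tol keyword on the dense qrfact is hypothetical here, since today a tolerance can only enter through an explicit call to the solver:

```julia
using LinearAlgebra

A = [1.0 2.0; 2.0 4.0]            # rank-deficient example
b = [1.0, 0.0]

# Today: the tolerance lives in the solver, which the common `\` idiom never
# calls with extra arguments.
F = qrfact(A, Val(true))
x = F \ b                         # no slot for a tolerance in this idiom

# If tol were a factorization argument (hypothetical spelling), `\` would just
# work, mirroring the sparse qrfact(A, tol = ...) behaviour:
# F = qrfact(A, Val(true), tol = 1e-10)
# x = F \ b
```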

> qrfact(SparseMatrixCSC) takes a (currently undocumented, see #26592) tol argument

My approach would be to change the meaning of tol to a relative tolerance, which would internally be multiplied by an appropriate measure of the size of the matrix; this scales well. In several places (pinv, rank) it is already a relative tolerance; for cholfact it is absolute and passed down to LAPACK. In the latter case, too, the meaning of tol is not yet fully documented.
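As a point of reference, the relative-tolerance convention used by rank (and similarly pinv) looks roughly like the following; the exact default factor in Base may differ, so treat this as an illustration of the scaling idea rather than the precise definition:

```julia
using LinearAlgebra

A = [1.0 2.0 3.0; 2.0 4.0 6.0; 1.0 1.0 1.0]

# A relative cutoff scaled against the largest singular value is invariant
# under rescaling A -> c*A, unlike an absolute cutoff.
s    = svdvals(A)
rtol = minimum(size(A)) * eps(eltype(s))
r    = count(s .> rtol * first(s))     # numerical rank (2 for this A)
```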

@chriscoey

I'm with @KlausC on this. In my code I need a rank-revealing QR of a matrix that could be sparse or dense (not for solving a linear system; I need information from Q and R).
