01 October 2011

Fastest Krylov method for symmetric indefinite problems?

One of my engineering correspondents asked which Krylov subspace method would be fastest for solving a symmetric indefinite problem: GMRES, a least-squares solver such as LSQR, or BiCG.  I've thought about this a lot for symmetric positive definite problems, but not so much for the indefinite case.  Here are my thoughts; I welcome corrections or further suggestions.

BiCG reduces to a redundantly computed CG when the matrix is symmetric positive definite, so CG should always be faster than BiCG in that case.  I haven't thought about this question before for symmetric indefinite problems, but my guess is that MINRES would be faster than BiCG (or, say, QMR) in most cases.  MINRES guarantees monotonic convergence of the residual norm; BiCG doesn't.  Their storage requirements are comparable.

For symmetric (indefinite) problems, MINRES is a special case of GMRES that exploits symmetry via short recurrences: its storage and work per iteration stay constant, whereas GMRES must store and orthogonalize against the whole Krylov basis (or restart).  MINRES should therefore be faster than GMRES, though MINRES is a bit less robust.
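
To make the comparison concrete, here is a small sketch using SciPy's iterative solvers on a symmetric indefinite tridiagonal test matrix.  The matrix, size, and tolerances are illustrative assumptions of mine, not from the question:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import minres, gmres

# Symmetric indefinite test matrix: half the diagonal is -1, half is +1,
# with small off-diagonal coupling, so eigenvalues of both signs appear.
n = 200
main = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])
A = diags([0.1 * np.ones(n - 1), main, 0.1 * np.ones(n - 1)], [-1, 0, 1]).tocsr()
b = np.ones(n)

x_m, info_m = minres(A, b)   # short recurrences, fixed storage per iteration
x_g, info_g = gmres(A, b)    # stores the growing Krylov basis (restarted by default)

for name, x in [("minres", x_m), ("gmres", x_g)]:
    print(name, np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

Both solvers converge here; the point is that MINRES gets the same minimum-residual property at a fixed cost per iteration.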

LSQR works a little bit like BiCG in that it builds up two Krylov bases (via Golub-Kahan bidiagonalization), and each iteration applies both A and its transpose.  For a symmetric matrix that means two applications of A per iteration, versus one for MINRES, so my guess is that MINRES would be faster than LSQR.

If the linear system is singular or very ill conditioned, one may have more success with LSQR than with GMRES, more success with GMRES than with MINRES, and more success with MINRES than SYMMLQ.  LSQR is able to solve rank-deficient problems in a least-squares sense.  GMRES and MINRES (depending on the implementation) may be able to solve rank-deficient problems in a least-squares sense.
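
As a sketch of the rank-deficient case (the matrix below is an invented example): LSQR still returns a least-squares solution when A has linearly dependent columns, which we can check against the normal equations.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

# Rank-deficient least-squares problem: the third column of A is the
# sum of the first two, so rank(A) = 2 even though A has 3 columns.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 2))
A = np.hstack([B, B.sum(axis=1, keepdims=True)])
b = rng.standard_normal(50)

x = lsqr(A, b, atol=1e-12, btol=1e-12)[0]

# A least-squares solution satisfies the normal equations: A^T (b - A x) = 0.
print(np.linalg.norm(A.T @ (b - A @ x)))
```

The printed quantity is (numerically) zero even though A itself is singular in the column sense.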

On the other hand, "LSQR with preconditioning" means something rather different from what users might expect: because LSQR implicitly works with the normal equations, a preconditioner changes the variables in which the problem is solved, so properties such as the minimum-norm solution refer to the preconditioned problem, not the original one.

Alas, I want to write more, but time constrains me...

5 comments:

science-enthusiast said...

Hi, what is your suggestion for a positive semidefinite system (not strictly positive definite) that is highly ill conditioned and rank deficient?

HilbertAstronaut said...

If you mean to solve the linear system in a least-squares sense, then use LSQR or LSMR:

http://web.stanford.edu/group/SOL/software/lsmr/
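
A small illustration of that suggestion, using SciPy's lsmr on a made-up singular positive semidefinite system (the matrix here is my own example):

```python
import numpy as np
from scipy.sparse.linalg import lsmr

# Rank-deficient positive semidefinite matrix: A = B B^T with B of size
# 6x2, so A is 6x6 but has rank 2; b generally lies outside range(A).
rng = np.random.default_rng(1)
B = rng.standard_normal((6, 2))
A = B @ B.T
b = rng.standard_normal(6)

x = lsmr(A, b, atol=1e-12, btol=1e-12)[0]

# LSMR returns a least-squares solution; since A is symmetric, A^T = A,
# so the normal-equations residual is A (b - A x), which should be tiny.
print(np.linalg.norm(A @ (b - A @ x)))
```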

science-enthusiast said...

Thanks a lot for the reply. I am using an iterative solver as part of a truncated Newton method. The truncation rule is classical: a forcing sequence on the residual. Since MINRES yields a monotonically decreasing residual norm, I have seen some people use it. However, people say that LSQR is mathematically equivalent to CG, and you have indicated that it has better performance than MINRES. So I would like to know which is more suitable.

HilbertAstronaut said...

LSQR should be mathematically equivalent to MINRES if the matrix is square and has full rank. CG minimizes the A-norm of the error (which requires A to be symmetric positive definite); MINRES minimizes the 2-norm of the residual.

If the matrix does not have full rank, then Newton's method will have problems. If your matrix is ill conditioned but normally has full rank, then you should use a preconditioner. A good preconditioner matters much more for reducing the iteration count than the choice of iterative solver.
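
To illustrate that last point, here is a sketch with an invented ill-conditioned symmetric matrix and a simple Jacobi (diagonal) preconditioner, chosen purely for demonstration:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import minres, LinearOperator

# Symmetric, diagonally dominant matrix whose diagonal spans six orders
# of magnitude, so the condition number is roughly 1e6.
n = 300
d = np.logspace(0, 6, n)
A = diags([0.5 * np.ones(n - 1), d, 0.5 * np.ones(n - 1)], [-1, 0, 1]).tocsr()
b = np.ones(n)

def solve(M=None):
    its = []                       # count iterations via the callback
    x, info = minres(A, b, M=M, callback=lambda xk: its.append(1))
    return x, info, len(its)

x0, info0, k0 = solve()                                  # no preconditioner
M = LinearOperator((n, n), matvec=lambda v: v / d)       # Jacobi: M ~ diag(A)^-1
x1, info1, k1 = solve(M)

print("iterations without/with preconditioner:", k0, k1)
```

With the diagonal scaling, the preconditioned operator is close to the identity, and the iteration count drops dramatically; no change of solver would buy that much.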

science-enthusiast said...

The function is the sum of two log-sum-exp functions, which is not strictly convex. Hence, the Hessian is not full rank. Also, its condition number increases with a parameter. I am seeing that trust-region methods, like Steihaug's, or adding a damping matrix, help me when the parameter is small. For larger values of the parameter, I am hoping a preconditioner will help.