
Row-Splitting ILU Preconditioners for Sparse Least-Squares Problems

Jennifer Scott, Miroslav Tůma

Abstract

Preconditioning for overdetermined least-squares problems has received comparatively little attention, and designing methods that are both effective and memory-efficient remains challenging. We propose a class of ILU-based preconditioners built around a row-splitting strategy that identifies a well-conditioned square submatrix via an incomplete LU factorization and combines its incomplete factors with algebraic corrections from the remaining rows. This construction avoids forming the normal equations and is well suited to problems for which the normal matrix is ill-conditioned or relatively dense. Numerical experiments on test problems arising from practical applications illustrate the effectiveness of the proposed approach when used with a Krylov subspace solver and demonstrate that it can outperform preconditioners based on incomplete Cholesky factorization of the normal equations, including for sparse-dense problems, where the splitting naturally isolates the dense rows.
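The row-splitting idea described in the abstract can be illustrated with a deliberately simplified sketch: take a square (here, leading) row block $A_1$ of $A$, compute an incomplete LU factorization of it with SciPy's `spilu`, and use the result as a right preconditioner for LSQR on the full problem. This omits the paper's algebraic corrections from the remaining rows $A_2$ and its strategy for selecting a well-conditioned block; the test matrix, block choice, and drop parameters below are illustrative assumptions, not the authors' method.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
m, n = 200, 50

# Sparse overdetermined test matrix; an identity is added to the leading
# n x n block so that the chosen square submatrix A1 is well conditioned.
A = sp.random(m, n, density=0.1, random_state=0, format="csr")
A = (A + sp.vstack([sp.identity(n), sp.csr_matrix((m - n, n))])).tocsr()
b = rng.standard_normal(m)

# Row splitting: here simply the first n rows form the square block A1.
A1 = A[:n, :].tocsc()

# Incomplete LU of A1 gives M = L U ~ A1, used as a right preconditioner.
ilu = spla.spilu(A1, drop_tol=1e-3, fill_factor=10)

# Right-preconditioned operator y -> A (M^{-1} y), with its adjoint
# z -> M^{-T} (A^T z) needed by LSQR.
Aop = spla.LinearOperator(
    (m, n),
    matvec=lambda y: A @ ilu.solve(y),
    rmatvec=lambda z: ilu.solve(A.T @ z, trans="T"),
)

# Solve min ||A M^{-1} y - b||_2, then recover x = M^{-1} y.
y = spla.lsqr(Aop, b, atol=1e-10, btol=1e-10)[0]
x = ilu.solve(y)

# Optimality check for the least-squares solution: A^T (b - A x) ~ 0.
print(np.linalg.norm(A.T @ (b - A @ x)))
```

With a good square block, the preconditioned operator $A M^{-1}$ clusters singular values and LSQR converges in far fewer iterations than on $A$ directly, which is the motivation for the splitting.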

Paper Structure

This paper contains 13 sections, 1 theorem, 24 equations, 3 figures, 4 tables, 6 algorithms.

Key Result

Lemma 3.1

Assume a row splitting of the $m \times n$ ($m \ge n$) full-rank matrix $A$ of the form (eq:splita), where the $k \times n$ ($k \ge n$) matrix $A_1$ is of full rank. Let $x^{(0)}$ be an approximate solution of the LLS problem (eq:ls) and let $r_1 = b_1 - A_1 x^{(0)}$ and $r_2 = b_2 - A_2 x^{(0)}$ be the corresponding residual components, with …
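A sketch of the splitting the lemma refers to, consistent with the definitions above (the partition of $b$ into $b_1$, $b_2$ conformal with $A_1$, $A_2$ is implied by the residual definitions):

```latex
A = \begin{pmatrix} A_1 \\ A_2 \end{pmatrix}, \qquad
b = \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}, \qquad
r = b - A x^{(0)}
  = \begin{pmatrix} b_1 - A_1 x^{(0)} \\ b_2 - A_2 x^{(0)} \end{pmatrix}
  = \begin{pmatrix} r_1 \\ r_2 \end{pmatrix},
```

where $A_1 \in \mathbb{R}^{k \times n}$ holds the rows selected for the (incomplete) factorization and $A_2 \in \mathbb{R}^{(m-k) \times n}$ the remaining rows.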

Figures (3)

  • Figure 1: The convergence of the backward error estimate $ratio_{PT}$ for the problem 14 with preconditioner ILU(10,0.0). The matrix $\widetilde{S}$ is constructed and factorized as a dense matrix.
  • Figure 2: The convergence of the backward error estimate $ratio_{PT}$ for problems 5 and 13 with preconditioner ILU(10,0.0). The matrix $\widetilde{S}$ is constructed and factorized as a dense matrix.
  • Figure 3: The convergence of the backward error estimate $ratio_{PT}$ for problems 10 and 12 with preconditioner ILU(10,0.0). For problem 10 (left) the matrix $\widetilde{S}$ is constructed and factorized as a dense matrix; for problem 12 (right), Step 2 of Algorithm \ref{alg:apply_ilu} is approximated by two iterations of the conjugate gradient method.

Theorems & Definitions (2)

  • Lemma 3.1
  • Proof of Lemma 3.1