Nearly-Linear Time Algorithms for Preconditioning and Solving Symmetric, Diagonally Dominant Linear Systems

Daniel A. Spielman, Shang-Hua Teng

TL;DR

This work addresses efficiently solving linear systems in symmetric, weakly diagonally dominant matrices ($\mathrm{SDD}_{0}$) by developing a multilevel solver that leverages ultra-sparsifiers and subgraph preconditioners. The core idea is to precondition the system with a hierarchy of increasingly sparse Laplacian-inspired matrices, solved recursively via partial Cholesky factorizations and a fixed-budget preconditioned Chebyshev scheme. Two principal contributions stand out: (1) a detailed analysis of a recursive solver that achieves near-linear time for general $\mathrm{SDD}_{0}$ systems, and (2) a construction of ultra-sparsifiers (and their planar variants) that yield good finite generalized condition numbers $\kappa_f(A,B)$ while keeping the off-diagonal sparsity under control. The resulting algorithms enable nearly-linear-time solvers (and approximate Fiedler vector computation) with practical applicability to graph-based problems and elliptic PDE discretizations, and the planarity-focused results offer concrete, implementable time bounds like $O(n \log^{2} n + n \log n \log \log n \log(1/\epsilon))$ for planar cases. Together, these advances push toward fast, scalable solvers for a broad class of sparse, structured linear systems and their eigenvector analytics.
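To illustrate the general pattern the TL;DR describes (precondition, then iterate), the sketch below solves a small Laplacian system with SciPy's conjugate gradient and a Jacobi (diagonal) preconditioner. The path graph, the diagonal preconditioner, and CG itself are deliberately simple stand-ins for the paper's ultra-sparsifiers and preconditioned Chebyshev scheme, not the paper's construction.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, cg

# Laplacian of a path graph on n vertices: a singular SDD matrix.
n = 50
main = 2.0 * np.ones(n)
main[0] = main[-1] = 1.0
L = diags([main, -np.ones(n - 1), -np.ones(n - 1)], [0, -1, 1], format="csr")

# Make the right-hand side consistent: project out the all-ones null space.
b = np.random.default_rng(0).standard_normal(n)
b -= b.mean()

# Jacobi (diagonal) preconditioner -- a crude stand-in for the paper's
# subgraph / ultra-sparsifier preconditioners.
d_inv = 1.0 / L.diagonal()
M = LinearOperator((n, n), matvec=lambda v: d_inv * v)

x, info = cg(L, b, M=M)  # info == 0 signals convergence
```

Swapping in a better preconditioner (for the paper, one built from an ultra-sparsifier) changes only the construction of `M`; the outer iterative loop stays the same.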

Abstract

We present a randomized algorithm that, on input a symmetric, weakly diagonally dominant $n$-by-$n$ matrix $A$ with $m$ nonzero entries and an $n$-vector $b$, produces a $y$ such that $\|y - A^{+} b\|_{A} \leq \epsilon \|A^{+} b\|_{A}$ in expected time $O(m \log^{c} n \log(1/\epsilon))$, for some constant $c$. By applying this algorithm inside the inverse power method, we compute approximate Fiedler vectors in a similar amount of time. The algorithm applies subgraph preconditioners in a recursive fashion. These preconditioners improve upon the subgraph preconditioners first introduced by Vaidya (1990). For any symmetric, weakly diagonally dominant matrix $A$ with non-positive off-diagonal entries and $k \geq 1$, we construct in time $O(m \log^{c} n)$ a preconditioner $B$ of $A$ with at most $2(n - 1) + O((m/k) \log^{39} n)$ nonzero off-diagonal entries such that the finite generalized condition number $\kappa_{f}(A,B)$ is at most $k$, for some other constant $c$. In the special case when the nonzero structure of the matrix is planar, the corresponding linear system solver runs in expected time $O(n \log^{2} n + n \log n \log \log n \log(1/\epsilon))$. We hope that our introduction of algorithms of low asymptotic complexity will lead to the development of algorithms that are also fast in practice.
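The abstract's route to approximate Fiedler vectors (inverse power method on top of a fast solver) can be sketched as follows. This toy uses a dense least-squares solve in place of the paper's near-linear-time solver, and the cycle graph $C_{12}$ and iteration count are illustrative choices, not from the paper.

```python
import numpy as np

# Laplacian of the cycle graph C_12; its Fiedler value is 2 - 2*cos(2*pi/12).
n = 12
L = 2.0 * np.eye(n)
for i in range(n):
    L[i, (i + 1) % n] = -1.0
    L[i, (i - 1) % n] = -1.0

# Inverse power iteration, kept orthogonal to the all-ones null vector.
# Each step solves L y = x; a dense least-squares solve stands in for
# the paper's near-linear-time solver.
rng = np.random.default_rng(1)
x = rng.standard_normal(n)
x -= x.mean()
x /= np.linalg.norm(x)
for _ in range(200):
    y = np.linalg.lstsq(L, x, rcond=None)[0]
    y -= y.mean()          # stay orthogonal to the null space
    x = y / np.linalg.norm(y)

fiedler_value = x @ L @ x  # Rayleigh quotient of the converged vector
```

Since each iteration is dominated by one linear solve in $L$, replacing the dense solve with a solver of the kind the paper constructs yields the "similar amount of time" claim for Fiedler vectors.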


Paper Structure

This paper contains 25 sections, 35 theorems, 232 equations, 3 figures.

Key Result

Theorem 1.3

On input an $n\times n$ Laplacian matrix $A$ with $2m$ nonzero off-diagonal entries and a $p > 0$, Sparsify2 runs in expected time $O(m \log(1/p) \log^{17} n)$ and with probability at least $1-p$ produces a $c_{1} \log^{c_{2}}(n/p)$-sparsifier of $A$, for $c_{2} = 34$ and some absolute constant $c_{1}$.
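For orientation, the sparsifier notion in Theorem 1.3 is spectral. One common formulation (a paraphrase for context, not a quotation of the paper's definition) says that $B$ is a $\sigma$-sparsifier of $A$ when, for every vector $x$,

```latex
\frac{1}{\sigma}\, x^{\mathsf{T}} A x \;\le\; x^{\mathsf{T}} B x \;\le\; x^{\mathsf{T}} A x ,
```

so every finite generalized eigenvalue $\lambda$ of $A x = \lambda B x$ lies in $[1, \sigma]$ and hence $\kappa_{f}(A, B) \le \sigma$, which is exactly the quality measure used for the preconditioners in the abstract.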

Figures (3)

  • Figure 1: A Laplacian matrix and its corresponding weighted graph.
  • Figure 2: An example of a tree decomposition. Note that sets $W_{1}$ and $W_{6}$ overlap, and that set $W_{5}$ is a singleton set and that it overlaps $W_{4}$.
  • Figure 3: In this example, $e = w(u,v)$ and $\tau(e) = z(x,y)$.

Theorems & Definitions (66)

  • Definition 1.1: Ultra-Sparsifiers
  • Definition 1.2: Spectral Sparsifiers
  • Theorem 1.3: Spectral Sparsification
  • Proposition 3.1
  • Proposition 3.2
  • Proposition 3.3
  • Proposition 3.4: SupportGraph, Lemma 2.5
  • Proposition 3.5
  • Proposition 3.6
  • Proposition 4.1: Partial Cholesky Factorization
  • ...and 56 more