Eigenvalue Computation from the Optimization Perspective: On Jacobi-Davidson, IIGD, RQI, and Newton Updates
Yunkai Zhou
TL;DR
The paper ties eigenvalue computation to optimization through the Rayleigh quotient, derives a class of Newton-based updates, and uses Rayleigh quotient iteration (RQI) to unify Jacobi-Davidson, IIGD, and Newton-type updates. It introduces simplified correction equations (JDmeq and rqieq) that retain fast local convergence while simplifying the inner solves, and proves that the subspace augmented by each method's updating direction contains the RQI direction, so the subspace methods inherit RQI's locally cubic convergence for normal matrices and quadratic convergence for nonnormal ones. Numerical experiments on Matrix Market problems confirm the fast local convergence and show the practical benefits of the simplifications and of preconditioning for large-scale eigensolvers.
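For orientation, the central quantities are the following standard definitions (the notation here is a sketch of the textbook forms, not quoted from the paper):

```latex
% Rayleigh quotient of x with respect to A (x^* denotes the conjugate transpose)
\rho(x) = \frac{x^{*} A x}{x^{*} x}

% One RQI step: a shifted linear solve followed by normalization
(A - \rho(x_k) I)\, y_k = x_k, \qquad x_{k+1} = y_k / \| y_k \|_2

% Jacobi-Davidson correction equation for an approximate eigenpair (\theta, x):
% solve for a correction t orthogonal to x, with residual r = A x - \theta x
(I - x x^{*})\, (A - \theta I)\, (I - x x^{*})\, t = -r, \qquad t \perp x
```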
Abstract
We discuss the close connection between eigenvalue computation and optimization using the Newton method and subspace methods. From this connection we derive a new class of Newton updates. The new update formulation is similar to the well-known Jacobi-Davidson method. This similarity leads to simplified versions of the Jacobi-Davidson method and the inverse iteration generalized Davidson (IIGD) method. We prove that the projection subspace augmented by the updating direction from each of these methods contains the Rayleigh quotient iteration (RQI) direction. Hence the locally quadratic (cubic for normal matrices) convergence rate of RQI is retained and strengthened by the subspace methods. The theory is supported by extensive numerical results. Preconditioned formulations are also briefly discussed for large-scale eigenvalue problems.
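As a concrete illustration of the RQI behavior the subspace methods inherit, here is a minimal sketch (mine, not the paper's code) of plain Rayleigh quotient iteration on a symmetric matrix, where the local convergence rate is cubic:

```python
import numpy as np

def rqi(A, x0, tol=1e-12, maxit=50):
    """Rayleigh quotient iteration for a symmetric/Hermitian A (illustrative)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(maxit):
        rho = x @ (A @ x)                    # Rayleigh quotient rho(x)
        r = A @ x - rho * x                  # eigenresidual A x - rho x
        if np.linalg.norm(r) < tol:
            break
        # Shifted solve (A - rho I) y = x; near convergence this system is
        # nearly singular, which is exactly what makes the step so effective.
        y = np.linalg.solve(A - rho * np.eye(A.shape[0]), x)
        x = y / np.linalg.norm(y)
    return rho, x

rng = np.random.default_rng(0)
B = rng.standard_normal((100, 100))
A = (B + B.T) / 2                            # random symmetric test matrix
rho, x = rqi(A, rng.standard_normal(100))
print(rho, np.linalg.norm(A @ x - rho * x))  # residual near machine precision
```

In practice the shifted (or projected) system is solved only approximately; the simplified correction equations and the preconditioned formulations discussed in the paper target precisely this inner solve for large-scale problems.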
