A Near-Optimal Total Complexity for the Inexact Accelerated Proximal Gradient Method via Quadratic Growth
Hongda Li, Xianfu Wang
Abstract
We consider the optimization problem $\min_{x\in \mathbb{R}^n} F(x):=f(x)+\omega(Ax)$, where $A$ is a linear map, $f$ is an $L$-Lipschitz smooth function, and $\omega$ is a proper, lower semicontinuous, and convex function. We prove in this paper that when $\omega$ is a conic polyhedral function, the inexact accelerated proximal gradient method (IAPG), employed in a double-loop structure, achieves $\varepsilon$-optimality in function value with a total complexity of $\mathcal O(\ln(1/\varepsilon)/\sqrt{\varepsilon})$, measured by the total number of calls to the proximal operator of the convex conjugate $\omega^\star$ and to the gradient of $f$. To the best of our knowledge, this improves upon the best-known complexity for IAPG. The key theoretical ingredient is a quadratic growth condition on the dual of the inexact proximal problem, which arises from the conic polyhedral structure of $\omega$ and implies linear convergence of the inner proximal gradient loop. To validate these findings, we conduct numerical experiments on a robust TV-$\ell_2$ signal recovery problem, demonstrating fast convergence.
