The Riemannian Landing Method: From projected gradient flows to SQP

Florentin Goyens, Florian Feppon

Abstract

Landing methods have recently emerged in Riemannian matrix optimization as efficient schemes for handling nonlinear equality constraints without resorting to costly retractions. These methods decompose the search direction into tangent and normal components, enabling asymptotic feasibility while maintaining inexpensive updates. In this work, we provide a unifying geometric framework which reveals that, under suitable choices of Riemannian metric, the landing algorithm encompasses several classical optimization methods such as projected and null-space gradient flows, Sequential Quadratic Programming (SQP), and a certain form of the augmented Lagrangian method. In particular, we show that a quadratically convergent landing method essentially reproduces the quadratically convergent SQP method. These connections also allow us to propose a globally convergent landing method using adaptive step sizes. The backtracking line search satisfies an Armijo condition on a merit function, and does not require prior knowledge of Lipschitz constants. Our second key contribution is to analyze landing methods through a geometric parameterization of the metric in terms of fields of oblique projectors and associated metric restrictions. This viewpoint disentangles the roles of orthogonality, tangent and normal metrics, and elucidates how to design the metric to obtain explicit tangent and normal updates. For matrix optimization, this framework not only recovers recent constructions in the literature for problems with orthogonality constraints, but also provides systematic guidelines for designing new metrics that admit closed-form search directions.
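
To make the tangent–normal decomposition and the backtracking step concrete, the following is a minimal numerical sketch of a landing-type iteration for the orthogonality-constrained problem min f(X) subject to X^T X = I. It is an illustration, not the algorithm analyzed in the paper: the quadratic objective, the penalty N(X) = (1/4)||X^T X - I||_F^2, the relative-gradient tangent term, the merit weight mu, and all step parameters are assumptions made for this example.

# Minimal sketch (illustrative assumptions throughout): a landing-type iteration
# for min f(X) s.t. X^T X = I, combining a tangent term with a normal term that
# pushes the iterate back toward the constraint set, plus an Armijo backtracking
# line search on the merit function phi(X) = f(X) + mu * N(X).
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                          # symmetric A: f(X) = -1/2 tr(X^T A X)

def f(X):      return -0.5 * np.trace(X.T @ A @ X)
def grad_f(X): return -A @ X               # Euclidean gradient of f
def N(X):      return 0.25 * np.linalg.norm(X.T @ X - np.eye(p)) ** 2
def grad_N(X): return X @ (X.T @ X - np.eye(p))   # Euclidean gradient of N
def skew(M):   return 0.5 * (M - M.T)

mu, lam, c, beta = 1.0, 1.0, 1e-4, 0.5     # merit weight, normal weight, Armijo constants
X = np.linalg.qr(rng.standard_normal((n, p)))[0]  # feasible starting point

for k in range(200):
    d_T = -skew(grad_f(X) @ X.T) @ X       # tangent component (relative gradient)
    d_N = -lam * grad_N(X)                 # normal component, decreases infeasibility
    d = d_T + d_N
    phi = f(X) + mu * N(X)
    slope = np.sum((grad_f(X) + mu * grad_N(X)) * d)  # directional derivative of the merit
    alpha = 1.0
    while f(X + alpha * d) + mu * N(X + alpha * d) > phi + c * alpha * slope:
        alpha *= beta                      # backtrack until the Armijo condition holds
        if alpha < 1e-12:
            break
    X = X + alpha * d

print("feasibility ||X^T X - I||_F:", np.linalg.norm(X.T @ X - np.eye(p)))
print("objective f(X):", f(X))

In this sketch the step size is found by backtracking alone, so no Lipschitz constant is required; the merit weight mu is fixed by hand, whereas the paper's globalization is stated to satisfy an Armijo condition on a merit function within its own framework.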

Paper Structure

This paper contains 19 sections, 32 theorems, 224 equations, and 1 algorithm.

Key Result

Lemma 4.1

Consider the landing scheme \eqref{eq:landing} with the tangent and normal terms calculated in two different metrics $g_1$ and $g_2$. Then there exists a common metric $g$ such that $d_T(x)=-\mathrm{grad}^{g}_{\mathcal{M}_x}f(x)$ and $d_N(x)=-\nabla_g \psi(x)$.
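
The lemma says that the tangent and normal terms, even when defined with respect to two different metrics, can be read as gradients in a single common metric $g$. The display below is an illustrative reconstruction rather than a quotation of \eqref{eq:landing}: it assumes the discrete update with step size $\alpha_k$ implied by the abstract, and reads $\mathcal{M}_x$ as the constraint level set through $x$ and $\psi$ as the infeasibility measure.

\begin{equation*}
  x_{k+1} \;=\; x_k + \alpha_k \bigl( d_T(x_k) + d_N(x_k) \bigr),
  \qquad
  d_T(x) = -\mathrm{grad}^{g}_{\mathcal{M}_x} f(x),
  \quad
  d_N(x) = -\nabla_g \psi(x).
\end{equation*}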

Theorems & Definitions (66)

  • Lemma 4.1
  • Proof
  • Remark 4.1
  • Remark 4.2
  • Proposition 1
  • Proof
  • Remark 4.3
  • Proposition 2
  • Proof
  • Proposition 3
  • ...and 56 more