Structure, Analysis, and Synthesis of First-Order Algorithms

Jared Miller, Carsten Scherer, Fabian Jakob, Andrea Iannelli

Abstract

Optimization algorithms can be interpreted through the lens of dynamical systems as the interconnection of linear systems and a set of subgradient nonlinearities. This dynamical systems formulation allows for the analysis and synthesis of optimization algorithms by solving robust control problems. In this work, we use the celebrated internal model principle in control theory to structurally factorize convergent composite optimization algorithms into suitable network-dependent internal models and core subcontrollers. As the key benefit, we reveal that this permits us to synthesize optimization algorithms even if information is transmitted over networks featuring dynamical phenomena such as time delays, channel memory, or crosstalk. Design of these algorithms is achieved under bisection in the exponential convergence rate either through a nonconvex local search or by alternation of convex semidefinite programs. We demonstrate factorization of existing optimization algorithms and the automated synthesis of new optimization algorithms in the networked setting.
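The bisection over the exponential convergence rate mentioned in the abstract can be sketched generically. The snippet below is a minimal illustration, not the paper's implementation: `certify` is a hypothetical placeholder for the feasibility oracle (the convex semidefinite program or nonconvex local search that certifies a given rate), mocked here with a simple threshold so the example runs.

```python
# Hedged sketch: bisection on the exponential convergence rate rho.
# `certify(rho)` stands in for an SDP feasibility test; it is assumed
# monotone (if rho is certifiable, any slower rate rho' >= rho is too).

def bisect_rate(certify, lo=0.0, hi=1.0, tol=1e-6):
    """Return the smallest certifiable rate in [lo, hi] up to tol."""
    if not certify(hi):
        return None  # no certifiable rate in the interval
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if certify(mid):
            hi = mid  # certificate exists at mid; try a faster rate
        else:
            lo = mid
    return hi

# Mock oracle: pretend rates at or above 0.3 admit a certificate.
best = bisect_rate(lambda rho: rho >= 0.3)
print(round(best, 4))  # -> 0.3
```

In the paper's setting the oracle would solve a semidefinite program at each queried rate; the bisection wrapper itself is rate-agnostic.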

Paper Structure

This paper contains 63 sections, 37 theorems, 192 equations, 27 figures, 6 tables.

Key Result

Proposition 2.1

If the algorithm eq:algorithm is well-posed and there exists some vector $x^*$ with $\lim_{k\to\infty}x_k=x^*$ for all initial conditions $x_0$, then eq:algorithm admits a unique fixed point $(x^*, w^*, z^*)$ and all its trajectories satisfy $\lim_{k\to\infty} (x_k,w_k,z_k)=(x^*, w^*, z^*)$.
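The fixed-point statement can be illustrated on a toy instance (this is an illustrative sketch, not the paper's algorithm class): gradient descent on the strongly convex function $f(x) = (x-3)^2$ converges to the unique fixed point $x^* = 3$ from every initial condition.

```python
# Illustration of convergence to a unique fixed point:
# x_{k+1} = x_k - t * f'(x_k) with f(x) = (x - 3)^2, so f'(x) = 2(x - 3).
# With step size t = 0.25 the error contracts by a factor 0.5 per step.

def gradient_step(x, t=0.25):
    return x - t * 2.0 * (x - 3.0)

for x0 in (-10.0, 0.0, 42.0):
    x = x0
    for _ in range(200):
        x = gradient_step(x)
    print(x0, "->", round(x, 6))  # every trajectory approaches 3.0
```

Here the iterate map is a contraction, so the limit is independent of $x_0$, matching the proposition's conclusion that all trajectories converge to the same fixed point.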

Figures (27)

  • Figure 1: Contours plotted of functions in $\mathcal{S}_{m, L}$ with $c=2$.
  • Figure 2: Functions $g_m$ in eq:g_sml and vectors inside $\partial g_m(2.1)$ for $m \in \{-1, 0\}$.
  • Figure 3: Standard algorithmic interconnection in eq:algorithm.
  • Figure 4: Block diagrams of an algorithm over a network given by $F \star (P \star {\color[rgb]{0,0.7,0}K})$ (left) and its closed-loop representation $F \star G$ (right) with $G = P \star K$ represented by $({\cal A}, {\cal B}, {\cal C}, {\cal D})$.
  • Figure 5: Proximal point method in eq:prox_point delayed by one time step (left) and the network representation (right).
  • ...and 22 more figures

Theorems & Definitions (88)

  • Definition 1
  • Definition 2
  • Definition 3
  • Remark 2.1
  • Definition 4
  • Proposition 2.1
  • Proof
  • Lemma 2.1
  • Proof
  • Definition 5
  • ...and 78 more