
Multidimensional Gradient-MUSIC: A Global Nonconvex Optimization Framework for Optimal Resolution

Albert Fannjiang, Weilin Li

Abstract

We develop a multidimensional version of Gradient-MUSIC for estimating the frequencies of a nonharmonic signal from noisy samples. The guiding principle is that frequency recovery should be based only on the signal subspace determined by the data. From this viewpoint, the MUSIC functional is an economical nonconvex objective encoding the relevant information, and the problem becomes one of understanding the geometry of its perturbed landscape. Our main contribution is a general structural theory showing that, under explicit conditions on the measurement kernel and the perturbation of the signal subspace, the perturbed MUSIC function is an admissible optimization landscape: suitable initial points can be found efficiently by coarse thresholding, gradient descent converges to the relevant local minima, and these minima obey quantitative error bounds. Thus the theory is not merely existential; it provides a constructive global optimization framework for multidimensional optimal resolution. We verify the abstract conditions in detail for two canonical sampling geometries: discrete samples on a cube and continuous samples on a ball. In both cases we obtain uniform, nonasymptotic recovery guarantees under deterministic as well as stochastic noise. In particular, for lattice samples in a cube of side length $4m$, if the true frequencies are separated by at least $\beta_d/m$ and the noise has $\ell^\infty$ norm at most $\varepsilon$, then Gradient-MUSIC recovers the frequencies with error at most \[ C_d \frac{\varepsilon}{m}, \] where $C_d, \beta_d>0$ depend only on the dimension. This scaling is minimax optimal in $m$ and $\varepsilon$. Under stationary Gaussian noise, the error improves to \[ C_d\frac{\sigma\sqrt{\log(m)}}{m^{1+d/2}}. \] This is the noisy super-resolution scaling. (See paper for the rest of the abstract.)
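The pipeline described in the abstract (signal subspace from the data, MUSIC function as a nonconvex objective, coarse thresholding for initialization, gradient descent to the local minima) can be illustrated with a minimal one-dimensional sketch. All concrete choices below (sample size, frequencies, threshold, step size, merging radius) are illustrative assumptions, not parameters from the paper, which treats the multidimensional case.

```python
import numpy as np

# Hypothetical 1D illustration of the Gradient-MUSIC pipeline; not the
# paper's multidimensional algorithm or its tuned constants.
rng = np.random.default_rng(0)
m = 32
theta = np.array([0.20, 0.25, 0.60])      # true frequencies on the torus
a = np.array([1.0, -1.0, 1.0])            # amplitudes
n = np.arange(2 * m + 1)
y = np.exp(2j * np.pi * np.outer(n, theta)) @ a
y += 1e-3 * (rng.standard_normal(y.shape) + 1j * rng.standard_normal(y.shape))

# Signal/noise subspaces from the SVD of the Hankel matrix of the samples.
H = np.array([[y[j + k] for k in range(m + 1)] for j in range(m + 1)])
U = np.linalg.svd(H)[0]
W = U[:, len(theta):]                     # noise subspace

def q(t):
    """MUSIC function: squared noise-subspace norm of the steering vector."""
    phi = np.exp(2j * np.pi * np.arange(m + 1) * t) / np.sqrt(m + 1)
    return np.linalg.norm(W.conj().T @ phi) ** 2

def dq(t, h=1e-7):
    # Central-difference derivative; an analytic gradient is equally easy here.
    return (q(t + h) - q(t - h)) / (2 * h)

# Coarse thresholding: grid points where q is small lie in the basins
# around the true frequencies and serve as initial points.
seeds = [t for t in np.arange(0, 1, 1 / (8 * m)) if q(t) < 0.5]

# Fixed-step gradient descent from each seed, then merge nearby minimizers.
mins = []
for t in seeds:
    for _ in range(500):
        t = (t - 5e-5 * dq(t)) % 1.0
    mins.append(t)
mins.sort()
est = [mins[0]]
for t in mins[1:]:
    if t - est[-1] > 1 / (4 * m):
        est.append(t)
print(np.round(est, 4))
```

With the tiny noise level above, the recovered list `est` matches the three planted frequencies to within a small fraction of the grid spacing, consistent with the error scaling stated in the abstract.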


Paper Structure

This paper contains 46 sections, 24 theorems, 348 equations, 4 figures, 2 tables, 1 algorithm.

Key Result

Lemma 3.1

Suppose that $f\colon \Omega\to [0,\infty)$ is an admissible optimization landscape for $\{\theta_\ell\}_{\ell=1}^s$ with parameters $(\varepsilon,\rho,\alpha_0,\alpha_1)$, and let $G\subseteq \Omega$ be a finite set satisfying a density condition (displayed hypothesis omitted; see paper). Then for every $\ell\in\{1,\dots,s\}$, thresholding $f$ on $G$ yields a valid initial point for $\theta_\ell$ (displayed conclusion omitted; see paper).

Figures (4)

  • Figure 1: Contour plot of a perturbed MUSIC function $\widetilde{q}$. The red dots indicate the true parameters $\{\theta_\ell\}_{\ell=1}^{16}$, assumed separated by at least $1/8$, while the amplitudes $\{a_\ell\}_{\ell=1}^{16}$ are chosen independently from $\{\pm1\}$. Samples are collected on the discrete square $Q_{10}\cap\mathbb{Z}^2$ and corrupted by i.i.d. Gaussian noise with mean zero and variance one. The smallest local minima of the MUSIC function closely track the true parameter configuration.
  • Figure 2: The kernel in Example 1 is a $d$-fold tensor product of a normalized Dirichlet kernel, shown on the left. The kernel in Example 2 is a radial function whose profile is a weighted Bessel function with order depending on the dimension $d$, shown on the right.
  • Figure 3: Accuracy of Gradient-MUSIC versus $m$ for three types of nonstationary independent Gaussian noise with growth parameter $r\in \{-1/2,0,1/2\}$. The 90th percentile error is recorded over 10 realizations of $\eta$ per parameter pair $(m, r)$. The dashed black lines are $0.01 \cdot m^\alpha$ for $\alpha \in\{-3/2, -2, -5/2\}$.
  • Figure 4: Decomposition of $\mathbb{T}^2$ into $A_{0,0}$, $A_{1,0}$, $A_{0,1}$, and $A_{1,1}$.

Theorems & Definitions (48)

  • Definition 2.1: Spectral estimation problem
  • Definition 2.2: MUSIC function
  • Definition 2.3: Measurement kernel
  • Remark 2.1
  • Definition 3.1: Admissible optimization landscape
  • Lemma 3.1: Thresholding yields valid initialization
  • Proof of Lemma 3.1
  • Theorem 3.2: The perturbed MUSIC function is an admissible landscape
  • Lemma 3.3
  • Theorem 4.1: Gradient-MUSIC for discrete samples in a cube
  • ...and 38 more