
High-Dimensional Signal Compression: Lattice Point Bounds and Metric Entropy

A. Iosevich, A. Vagharshakyan, E. Wyman

Abstract

We study worst-case signal compression under an $\ell^2$ energy constraint, with coordinate-dependent quantization precisions. The compression problem is reduced to counting lattice points in a diagonal ellipsoid. Under balanced precision profiles, we obtain explicit, dimension-dependent upper bounds on the logarithmic codebook size. The analysis refines Landau's classical lattice point estimates using uniform Bessel bounds due to Olenko and explicit Abel summation.
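As a concrete illustration of the reduction described above, the quantity being bounded is the number of integer points inside a diagonal ellipsoid $\sum_i (x_i/a_i)^2 \le R^2$, whose semi-axes $R a_i$ encode the coordinate-dependent precisions. The following is a minimal brute-force sketch of that count (not from the paper; the function name and interface are hypothetical, and the paper's contribution is sharp asymptotic bounds, not enumeration):

```python
import itertools
import math

def lattice_points_in_ellipsoid(radii, R):
    """Count integer points x in Z^k with sum_i (x_i / a_i)^2 <= R^2,
    i.e. lattice points of the diagonal ellipsoid with semi-axes R * a_i."""
    # Each coordinate x_i is confined to |x_i| <= R * a_i.
    bounds = [math.floor(R * a) for a in radii]
    count = 0
    for x in itertools.product(*(range(-b, b + 1) for b in bounds)):
        if sum((xi / ai) ** 2 for xi, ai in zip(x, radii)) <= R ** 2:
            count += 1
    return count

# Example: the unit disk (k = 2, a_1 = a_2 = 1, R = 1) contains the
# five integer points (0,0), (±1,0), (0,±1).
```

For large $R$ the count is asymptotic to the ellipsoid's volume, $\bigl(\prod_i a_i\bigr) R^k \,\omega_k$ with $\omega_k$ the unit-ball volume; the paper's refinement concerns the error term in this approximation, uniformly in the dimension $k$.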

Paper Structure

This paper contains 14 sections, 7 theorems, and 82 equations.

Key Result

Theorem 2.4

Assume $k\geq 2$ and $R\geq 2$. Then the main bound holds, where the $O(1)$ term remains bounded as $k\to\infty$. If, in addition, Condition cond:balanced holds, then $\varepsilon_{\mathrm{geom}}$ is comparable to $\varepsilon_{\mathrm{total}}/k$, and the main bound can be rewritten in that form. The implicit constants may depend on the constant $C$ in Condition cond:balanced. $\blacktriangleleft$

Theorems & Definitions (15)

  • Definition 2.1: Ellipsoidal codebook size
  • Remark 2.3
  • Theorem 2.4
  • Corollary 2.5: Covering number interpretation
  • Theorem 6.1: Olenko
  • Lemma 6.2
  • Lemma 7.1
  • Proof
  • Lemma 7.2
  • Proof
  • ...and 5 more