Decoding by Linear Programming
Emmanuel Candès, Terence Tao
TL;DR
The paper shows that exact recovery of a signal f from corrupted measurements y = Af + e is possible via ℓ1 minimization when the error e is sparse and the coding matrix A (or its annihilator F) satisfies restricted orthogonality conditions. A dual certificate approach proves uniqueness of the ℓ1 minimizer, with deterministic guarantees under δ_S and θ_{S,S'} constraints, and Gaussian matrices are shown to satisfy these conditions with high probability for small sparsity levels. Numerical experiments confirm robust exact recovery up to substantial fractions of corrupted outputs, and connections to optimal recovery demonstrate near-optimal performance for compressible signals using the same linear programming framework. The results unify deterministic and probabilistic perspectives, extend to general coding matrices, and suggest practical decoding strategies with broad relevance to compressed sensing and error correction.
Abstract
This paper considers the classical error-correcting problem which is frequently discussed in coding theory. We wish to recover an input vector $f \in \mathbb{R}^n$ from corrupted measurements $y = A f + e$. Here, $A$ is an $m$ by $n$ (coding) matrix and $e$ is an arbitrary and unknown vector of errors. Is it possible to recover $f$ exactly from the data $y$? We prove that under suitable conditions on the coding matrix $A$, the input $f$ is the unique solution to the $\ell_1$-minimization problem ($\|x\|_{\ell_1} := \sum_i |x_i|$) $$ \min_{g \in \mathbb{R}^n} \| y - Ag \|_{\ell_1} $$ provided that the support of the vector of errors is not too large, $\|e\|_{\ell_0} := |\{i : e_i \neq 0\}| \le \rho \cdot m$ for some $\rho > 0$. In short, $f$ can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program). In addition, numerical experiments suggest that this recovery procedure works unreasonably well; $f$ is recovered exactly even in situations where a significant fraction of the output is corrupted.
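To make the linear-programming recast concrete, here is a minimal sketch of the decoder. The problem sizes, the Gaussian coding matrix, and the use of SciPy's `linprog` are illustrative assumptions, not details from the paper: the ℓ1 residual minimization $\min_g \|y - Ag\|_{\ell_1}$ becomes an LP by introducing slack variables $t$ with $-t \le y - Ag \le t$ and minimizing $\sum_i t_i$.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not values from the paper):
n, m = 20, 80      # input dimension and number of measurements
k = 5              # number of corrupted entries (sparse error)

A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian coding matrix
f = rng.standard_normal(n)                    # input signal

e = np.zeros(m)
idx = rng.choice(m, size=k, replace=False)
e[idx] = 5.0 * rng.standard_normal(k)         # arbitrary sparse errors
y = A @ f + e                                 # corrupted measurements

# LP recast with variables x = (g, t):
#   minimize    sum(t)
#   subject to   A g - t <=  y   (i.e.  y - A g >= -t)
#               -A g - t <= -y   (i.e.  y - A g <=  t)
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[A, -np.eye(m)],
                 [-A, -np.eye(m)]])
b_ub = np.concatenate([y, -y])
bounds = [(None, None)] * (n + m)  # all variables free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
g_hat = res.x[:n]

recovery_error = np.max(np.abs(g_hat - f))
print(recovery_error)
```

With a small corrupted fraction like this (5 of 80 entries), the ℓ1 decoder typically recovers $f$ to solver precision, matching the paper's exact-recovery claim; pushing `k` higher shows the breakdown point the numerical experiments explore.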
