Deviation Bounds for Wavelet Shrinkage
Dawei Hong, Jean-Camille Birget
TL;DR
This work establishes non-asymptotic deviation bounds for the Donoho–Johnstone wavelet shrinkage estimator under Hölder smoothness of the signal and independent, bounded noise. By leveraging Talagrand's isoperimetric inequality, the authors construct high-probability events controlling noise amplitudes across wavelet scales and prove bounds for both the maximum squared error and the mean squared error of the estimator, with rate $\left(\frac{\log n}{n}\right)^{\frac{2\alpha}{1+2\alpha}}$. The results apply to Haar and CDJV interval wavelets and require thresholds of the form $\lambda_{n,\delta}=C_{\varphi} b\left(1+2\sqrt{(1+\delta)\ln 2}\right)\sqrt{\frac{\log n}{n}}$, yielding guarantees that hold with explicit finite-sample probability at least $1-\frac{9}{n^{1+\delta}}$. This extends prior expectation-based analyses to deviation guarantees and broadens applicability to bounded, non-Gaussian noise, offering practical probabilistic performance assurances for wavelet shrinkage in signal reconstruction.
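As a rough illustration of the procedure summarized above, the sketch below denoises bounded-noise samples with an orthonormal Haar transform and hard thresholding at a level of the stated form $\lambda_{n,\delta}$. It is a minimal sketch under illustrative assumptions, not the paper's exact estimator: the choice of Haar wavelet only, the hard-thresholding rule, the placeholder values of $C_{\varphi}$, $b$, and $\delta$, and the rescaling of the discrete coefficients are all assumptions made for this example.

```python
import numpy as np

def haar_dwt(x):
    """Orthonormal Haar decomposition of a signal whose length is a power of two.
    Returns the coarsest scaling coefficient and the detail coefficients, finest first."""
    a = np.asarray(x, dtype=float)
    details = []
    while a.size > 1:
        even, odd = a[0::2], a[1::2]
        details.append((even - odd) / np.sqrt(2.0))  # detail (wavelet) coefficients
        a = (even + odd) / np.sqrt(2.0)              # approximation coefficients
    return a, details

def haar_idwt(a, details):
    """Inverse of haar_dwt."""
    for d in reversed(details):
        even = (a + d) / np.sqrt(2.0)
        odd = (a - d) / np.sqrt(2.0)
        a = np.empty(2 * d.size)
        a[0::2], a[1::2] = even, odd
    return a

def wavelet_shrinkage(y, b, delta=0.5, C_phi=1.0):
    """Hard-threshold the Haar detail coefficients at a level of the form
    lambda = C_phi * b * (1 + 2*sqrt((1+delta)*ln 2)) * sqrt(log(n)/n).
    C_phi, delta, and the coefficient rescaling are placeholder assumptions."""
    n = y.size
    lam = C_phi * b * (1.0 + 2.0 * np.sqrt((1.0 + delta) * np.log(2.0))) * np.sqrt(np.log(n) / n)
    a, details = haar_dwt(y)
    # the discrete transform of n samples scales coefficients by roughly sqrt(n)
    # relative to continuous wavelet coefficients, so compare against sqrt(n) * lam
    details = [np.where(np.abs(d) > np.sqrt(n) * lam, d, 0.0) for d in details]
    return haar_idwt(a, details)

# usage: recover a smooth signal from samples corrupted by bounded, zero-mean noise
rng = np.random.default_rng(0)
n = 1024
t = np.linspace(0.0, 1.0, n)
signal = np.sin(2 * np.pi * t) + 0.3 * np.cos(6 * np.pi * t)
b = 0.2                                   # bound on the noise amplitude
y = signal + rng.uniform(-b, b, size=n)   # independent, zero-mean, bounded noise
estimate = wavelet_shrinkage(y, b)
print("mean squared error:", np.mean((estimate - signal) ** 2))
```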
Abstract
We analyse the wavelet shrinkage algorithm of Donoho and Johnstone in order to assess the quality of the reconstruction of a signal obtained from noisy samples. We prove deviation bounds for the maximum of the squares of the error, and for the average of the squares of the error, under the assumption that the signal comes from a Hölder class, and the noise samples are independent, of zero mean, and bounded. Our main technique is Talagrand's isoperimetric theorem. Our bounds refine the known bounds on the expectation of the average of the squares of the error.
