Generalized Entropy Power Inequalities and Monotonicity Properties of Information

Mokshay Madiman, Andrew Barron

TL;DR

This work generalizes entropy power and Fisher information inequalities to sums of independent variables across arbitrary collections of subset sums. By combining score-function projections, a variance-drop (ANOVA) lemma, and the de Bruijn link between information and entropy, it derives a broad family of subset-sum inequalities: $e^{2H(T_n)} \geq \frac{1}{r}\sum_{\mathbf{s}\in\mathcal{C}} e^{2H(T^{(\mathbf{s})})}$ and its Fisher-information analogue $\frac{1}{I(T_n)} \geq \frac{1}{r}\sum_{\mathbf{s}\in\mathcal{C}} \frac{1}{I(T^{(\mathbf{s})})}$, where $T_n = X_1 + \cdots + X_n$, $T^{(\mathbf{s})} = \sum_{i\in\mathbf{s}} X_i$ for a subset $\mathbf{s}$ in the collection $\mathcal{C}$, and $r$ is the maximum number of subsets in $\mathcal{C}$ containing any single index. These results subsume Shannon's classic EPI and the Artstein-Ball-Barthe-Naor (ABBN) leave-one-out inequality, establish entropy- and information-based monotonicity in the CLT (including on-average and variance-standardized forms for non-identical sums), and extend to refined forms via fractional packing. The framework yields both simple proofs and precise equality conditions (Gaussian case) and suggests broad applications in central limit theory, distributed estimation, and multi-user information theory.
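
As a quick sanity check on these two displays, here is a small numerical sketch (the variances and the collection $\mathcal{C}$ below are illustrative choices, not taken from the paper). For independent Gaussians both sides are available in closed form, since $e^{2H(N(0,v))} = 2\pi e\, v$ and $I(N(0,v)) = 1/v$; with a regular collection such as the leave-one-out subsets, the Gaussian case attains equality.

```python
import itertools
import math

# Illustrative check of the subset-sum inequalities for independent Gaussians
# X_i ~ N(0, sigma2[i]); the variances and the collection C are arbitrary choices.
sigma2 = [1.0, 2.0, 0.5, 3.0]
n = len(sigma2)
C = list(itertools.combinations(range(n), n - 1))     # leave-one-out collection
r = max(sum(i in s for s in C) for i in range(n))     # max number of subsets containing any one index

def ent_power(subset):
    """e^{2 H(T^(s))} for a Gaussian subset sum."""
    return 2 * math.pi * math.e * sum(sigma2[i] for i in subset)

def inv_fisher(subset):
    """1 / I(T^(s)) for a Gaussian subset sum."""
    return sum(sigma2[i] for i in subset)

lhs_ep, rhs_ep = ent_power(range(n)), sum(ent_power(s) for s in C) / r
lhs_fi, rhs_fi = inv_fisher(range(n)), sum(inv_fisher(s) for s in C) / r

print(f"entropy power : {lhs_ep:.4f} >= {rhs_ep:.4f}")   # equality here: Gaussian + regular collection
print(f"1/Fisher info : {lhs_fi:.4f} >= {rhs_fi:.4f}")
```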

Abstract

New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of $n$ independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands as well as in the more general setting of independent summands with variance-standardized sums.
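
To see how the subset-sum inequality yields the claimed CLT monotonicity, one can specialize it (a standard reduction, sketched here in notation chosen for this summary) to the leave-one-out collection: taking $\mathcal{C}$ to be all $(n-1)$-element subsets of $\{1,\dots,n\}$ gives $r = n-1$, and for i.i.d. summands with $S_n = X_1 + \cdots + X_n$ every leave-one-out sum is distributed as $S_{n-1}$, so

$$e^{2H(S_n)} \;\geq\; \frac{1}{n-1}\sum_{i=1}^{n} e^{2H(S_n - X_i)} \;=\; \frac{n}{n-1}\, e^{2H(S_{n-1})}.$$

Taking logarithms and using the scaling identity $H(aX) = H(X) + \log a$ then gives

$$H\!\left(\frac{S_n}{\sqrt{n}}\right) = H(S_n) - \tfrac{1}{2}\log n \;\geq\; H(S_{n-1}) - \tfrac{1}{2}\log(n-1) = H\!\left(\frac{S_{n-1}}{\sqrt{n-1}}\right),$$

i.e., the entropy of the standardized sum is non-decreasing in $n$.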

Paper Structure

This paper contains 10 sections, 14 theorems, 83 equations.

Key Result

Lemma 1

If $V_{1}$ and $V_{2}$ are independent random variables, and $V_{1}$ has an absolutely continuous density with score $\rho_{1}$, then $V_{1}+V_{2}$ has the score function $\rho(v) = \mathbb{E}\left[\rho_{1}(V_{1}) \mid V_{1}+V_{2}=v\right]$.
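
A hedged Monte Carlo sketch of this identity (the Gaussian choice of $V_1$ and $V_2$ below is illustrative, not from the paper): with $\rho_1(x) = -x/\sigma_1^2$, the conditional expectation $\mathbb{E}[\rho_1(V_1)\mid V_1+V_2=v]$ can be estimated by binning on the sum and compared with the explicit score $-v/(\sigma_1^2+\sigma_2^2)$ of the Gaussian sum $V_1+V_2$.

```python
import numpy as np

# Illustrative check of the score convolution identity
#   rho_{V1+V2}(v) = E[ rho_1(V1) | V1 + V2 = v ]
# for Gaussian V1 ~ N(0, s1), V2 ~ N(0, s2), where rho_1(x) = -x/s1 and the
# score of the sum is -v/(s1 + s2), both available in closed form.
rng = np.random.default_rng(0)
s1, s2 = 1.0, 2.0
v1 = rng.normal(0.0, np.sqrt(s1), 1_000_000)
v2 = rng.normal(0.0, np.sqrt(s2), 1_000_000)
t = v1 + v2

# Estimate E[rho_1(V1) | T = v] by averaging rho_1(V1) over narrow bins of T.
edges = np.linspace(-3.0, 3.0, 25)
idx = np.digitize(t, edges)
for b in (5, 10, 15, 20):                       # a few interior bins
    mask = idx == b
    v_mid = 0.5 * (edges[b - 1] + edges[b])     # bin center
    est = np.mean(-v1[mask] / s1)               # empirical conditional mean of rho_1(V1)
    exact = -v_mid / (s1 + s2)                  # score of the Gaussian sum at the center
    print(f"v ~ {v_mid:+.2f}:  E[rho_1 | T=v] ~ {est:+.4f}   score of sum = {exact:+.4f}")
```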

Theorems & Definitions (42)

  • Lemma 1: Convolution identity for scores
  • proof
  • Lemma 2: Variance drop
  • proof
  • Remark 1
  • Lemma 3
  • proof
  • Proposition 1: Monotonicity of Fisher information
  • proof
  • Theorem 1: Monotonicity of Entropy: IID Case
  • ...and 32 more