Weight distribution bounds to relate minimum distance, list decoding, and symmetric channel performance

Donald Kougang-Yombi, Jan Hązła

Abstract

We study relationships between worst-case and random-noise properties of error-correcting codes. More concretely, we consider connections between minimum distance, list decoding radius, and block error probability on noisy channels. A recent result of Pernice, Sprumont, and Wootters established a tight connection between list decoding radius and symmetric channel performance for linear codes. We extend this result to general codes. The proof proceeds by directly bounding the weight distribution rather than by sharp threshold techniques. We then turn to the relation between minimum distance and symmetric channel performance. The results we just described imply that a $q$-ary code of relative distance $\delta$ has vanishing error probability on the symmetric channel up to the Johnson radius $J_q(\delta)$. We improve upon this bound in the case of linear codes, for a range of $\delta$, for $q\ge 4$. In our proof we consider the \emph{erasure} properties of codes and bound their weight distribution through inequalities introduced by Samorodnitsky. The latter part of the proof gives a more general technique that bounds the symmetric channel performance of a linear code with constant relative distance and good erasure channel performance.
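For reference, $J_q(\delta)$ above denotes the $q$-ary Johnson radius; under the standard normalization (assumed here, since the paper's convention is not restated in this summary) it is

$$J_q(\delta) \;=\; \left(1-\frac{1}{q}\right)\left(1-\sqrt{1-\frac{q\delta}{q-1}}\right), \qquad\text{so in particular}\qquad J_2(\delta)=\frac{1-\sqrt{1-2\delta}}{2}.$$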

Paper Structure

This paper contains 45 sections, 28 theorems, 126 equations, and 9 figures.

Key Result

Theorem 1

Let $0<p\le 1-1/q$ and $\{\mathcal{C}_n\}_n\subseteq\Sigma^n$ be a family of $q$-ary codes which is $(p,L)$-list decodable for some $L=L(n)$ satisfying $d(\mathcal{C}_n)=\omega(\log(nL))$. Then, $\lim_{n\to\infty} P_B(\mathcal{C}_n,\mathop{\mathrm{qSC}}\nolimits_{p'})=0$ for every $p'<p$.
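As a sanity check on the growth condition (a worked instantiation, not taken from the paper's text): for a family with constant relative distance $\delta>0$ and polynomially bounded list size $L(n)\le n^{c}$, we have

$$d(\mathcal{C}_n)\ge \delta n \qquad\text{while}\qquad \log\bigl(nL(n)\bigr)\le (c+1)\log n = o(n),$$

so the hypothesis $d(\mathcal{C}_n)=\omega(\log(nL))$ is satisfied and Theorem 1 yields $\lim_{n\to\infty} P_B(\mathcal{C}_n,\mathop{\mathrm{qSC}}\nolimits_{p'})=0$ for every $p'<p$.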

Figures (9)

  • Figure 1: Our lower bound $p_*^{(q)}(\frac{q\delta}{q-1},\delta)$ from \ref{thm:unconditional_bound} plotted against the Johnson bound for $q=9$ and $q=17$. Our bound is larger for larger values of $\delta$. The upper bound $\delta$ and the lower bound $\delta/2$ are also plotted for illustration.
  • Figure 2: Binary case $q=2$, plot of $F_\lambda(\gamma, p)\coloneqq F(\gamma,p)+\gamma\log\left(2^\lambda-1\right)$ as a function of $\gamma$, for $\lambda=0.5$ and some values of $p$. Our results imply vanishing error probability on $\mathop{\mathrm{BSC}}\nolimits_p$ whenever $F_\lambda(\delta, p)>0$, assuming relative distance $\delta$ and vanishing error probability on $\mathop{\mathrm{BEC}}\nolimits_\lambda$.
  • Figure 3: Binary case $q=2$, plot of $p_*(\lambda,\delta)$ as a function of the BEC erasure probability $\lambda$, for fixed relative distances $\delta\in\{0.05, 0.1, 0.2, 0.4\}$. For comparison, we plot the threshold $p(\lambda)=\frac{1}{2}-\sqrt{2^{\lambda-1}(1-2^{\lambda-1})}$ from \cite{hkazla2021codes}, which does not require the assumption $\delta>0$. In the intervals where the graphs are constant, our bound matches the trivial bound $p_*(\lambda,\delta)\ge \delta/2$. (A code sketch evaluating these baseline quantities appears after this figure list.)
  • Figure 4: Binary case $q=2$, $p_*(\lambda,\delta)$ as a function of the relative distance $\delta$, for fixed erasure probabilities $\lambda\in\{0.25, 0.5, 0.7, 0.8\}$. Since a code family with relative distance $\delta$ has list decoding radius $J_2(\delta)$, by \ref{thm:list-vs-qsc-main} it also has vanishing symmetric channel error probability for $p<J_2(\delta)$. Our bounds improve upon this result under the additional BEC assumption whenever $p_*(\lambda,\delta)>J_2(\delta)$.
  • Figure 5: The threshold $p_*^{(q)}(\lambda,\delta)$ as a function of the relative distance $\delta$, for fixed erasure probability $\lambda=0.6$ and $q\in\{3,5,7,16\}$. We also provide the trivial bound $\delta/2$. (As in \ref{fig:delta_for_fixed_lambda}, our bound can be compared with the Johnson radius $J_q(\delta)$. We omit this in the interest of visual clarity.)
  • ...and 4 more figures
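Two of the quantities referenced in the captions above admit closed forms that can be evaluated directly: the $q$-ary Johnson radius $J_q(\delta)$ (standard formula, assumed to match the paper's normalization) and the binary threshold $p(\lambda)=\frac{1}{2}-\sqrt{2^{\lambda-1}(1-2^{\lambda-1})}$ quoted in the caption of Figure 3. The following Python sketch evaluates these two baselines at the parameter values used in the figures; the paper's own thresholds $p_*^{(q)}(\lambda,\delta)$ and the function $F_\lambda(\gamma,p)$ are not reproduced here, since their definitions are not included in this summary.

import math

def johnson_radius(delta: float, q: int = 2) -> float:
    """Standard q-ary Johnson radius J_q(delta) = (1 - 1/q) * (1 - sqrt(1 - q*delta/(q-1))).

    Assumes 0 <= delta <= 1 - 1/q, so the square root is real.
    """
    return (1 - 1 / q) * (1 - math.sqrt(1 - q * delta / (q - 1)))

def bec_to_bsc_threshold(lam: float) -> float:
    """Binary threshold p(lambda) = 1/2 - sqrt(2^(lambda-1) * (1 - 2^(lambda-1)))
    as quoted in the caption of Figure 3 (attributed there to hkazla2021codes).
    Assumes 0 <= lam <= 1."""
    x = 2 ** (lam - 1)
    return 0.5 - math.sqrt(x * (1 - x))

if __name__ == "__main__":
    # Johnson radius at the relative distances used in Figure 4, for q = 2 and q = 9.
    for delta in (0.05, 0.1, 0.2, 0.4):
        print(f"delta={delta:.2f}  J_2={johnson_radius(delta, 2):.4f}  "
              f"J_9={johnson_radius(delta, 9):.4f}")
    # Baseline threshold p(lambda) at the erasure probabilities used in Figure 4.
    for lam in (0.25, 0.5, 0.7, 0.8):
        print(f"lambda={lam:.2f}  p(lambda)={bec_to_bsc_threshold(lam):.4f}")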

Theorems & Definitions (58)

  • Theorem 1
  • Theorem 2
  • proof
  • Theorem 3: \cite{hkazla2021codes, abawonse2025generalized}
  • Definition 4
  • Definition 5
  • Theorem 6
  • Theorem 7
  • Claim 8
  • Definition 9
  • ...and 48 more