Explicit Codes Achieving List Decoding Capacity: Error-correction with Optimal Redundancy

Venkatesan Guruswami, Atri Rudra

TL;DR

The paper presents explicit folded Reed-Solomon codes that achieve list-decoding capacity up to an arbitrarily small loss ε for every rate R, with polynomial-time decoding. It introduces a multivariate interpolation framework (starting with trivariate and extending to s-variate interpolation) that enforces algebraic relations between a message polynomial f(X) and its shifted versions, enabling near-capacity error correction under adversarial noise. The work also extends to list recovery, yields binary codes list-decodable up to the Zyablov bound via concatenation, and gives capacity-achieving constructions over smaller alphabets using expander-based symbol redistribution. Together, these results show that list-decoding capacity can be achieved explicitly and efficiently over a range of alphabets, with guidance on parameter choices. The findings have broad implications for robust communication systems and for the design of efficient error-correcting codes with optimal redundancy.

Abstract

We present error-correcting codes that achieve the information-theoretically best possible trade-off between the rate and error-correction radius. Specifically, for every $0 < R < 1$ and $\varepsilon > 0$, we present an explicit construction of error-correcting codes of rate $R$ that can be list decoded in polynomial time up to a fraction $(1-R-\varepsilon)$ of {\em worst-case} errors. At least theoretically, this meets one of the central challenges in algorithmic coding theory. Our codes are simple to describe: they are {\em folded Reed-Solomon codes}, which are in fact {\em exactly} Reed-Solomon (RS) codes, but viewed as a code over a larger alphabet by careful bundling of codeword symbols. Given the ubiquity of RS codes, this is an appealing feature of our result, and in fact our methods directly yield better decoding algorithms for RS codes when errors occur in {\em phased bursts}. The alphabet size of these folded RS codes is polynomial in the block length. We are able to reduce this to a constant (depending on $\varepsilon$) using ideas concerning ``list recovery'' and expander-based codes from \cite{GI-focs01,GI-ieeejl}. Concatenating the folded RS codes with suitable inner codes also gives us polynomial time constructible binary codes that can be efficiently list decoded up to the Zyablov bound, i.e., up to twice the radius achieved by the standard GMD decoding of concatenated codes.
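The "careful bundling of codeword symbols" described above can be made concrete with a minimal sketch: an RS codeword of length n is split into blocks of m consecutive symbols, each block becoming a single symbol of the folded code over the larger alphabet. The function name and values below are illustrative, not the paper's notation.

```python
# A minimal sketch of folding: bundle m consecutive symbols of a codeword
# into one symbol over the alphabet of m-tuples. The symbols themselves are
# unchanged; only the alphabet view differs.

def fold_codeword(codeword, m):
    """Return the m-folded view of a codeword as a list of m-tuples."""
    if len(codeword) % m != 0:
        raise ValueError("block length must be divisible by the folding parameter m")
    return [tuple(codeword[i:i + m]) for i in range(0, len(codeword), m)]

# Example: a length-8 codeword folded with m = 4 becomes length 2.
c = [3, 1, 4, 1, 5, 9, 2, 6]
print(fold_codeword(c, 4))  # [(3, 1, 4, 1), (5, 9, 2, 6)]
```

This view also explains the phased-burst remark: a burst of up to m errors aligned with a block corrupts only one folded symbol.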

Paper Structure

This paper contains 20 sections, 15 theorems, 21 equations, and 6 figures.

Key Result

Lemma 3.1

Let $\{(\alpha_i,y_{i1},y_{i2})\}_{i=1}^{n_0}$ be an arbitrary set of $n_0$ triples from $\mathbb{F}^3$. Let $Q(X,Y_1,Y_2) \in \mathbb{F}[X,Y_1,Y_2]$ be a nonzero polynomial of $(1,k,k)$-weighted degree at most $D$ that has a zero of multiplicity $r$ at $(\alpha_i,y_{i1},y_{i2})$ for every $i$, $1 \le i \le n_0$. Let $f(X), g(X)$ be polynomials of degree at most $k$ such that for at least $t > D/r$ values of $i$, $1 \le i \le n_0$, we have $f(\alpha_i) = y_{i1}$ and $g(\alpha_i) = y_{i2}$. Then, $Q(X,f(X),g(X)) \equiv 0$.
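The nonzero polynomial $Q$ that Lemma 3.1 assumes is obtained, as is standard in interpolation-based decoding, by dimension counting: a zero of multiplicity $r$ at a point of $\mathbb{F}^3$ imposes $\binom{r+2}{3}$ homogeneous linear conditions, so a nonzero $Q$ exists whenever the number of monomials of $(1,k,k)$-weighted degree at most $D$ exceeds $n_0\binom{r+2}{3}$. The sketch below just tallies both sides; the parameter values are illustrative, not taken from the paper.

```python
from math import comb

def num_monomials(D, k):
    """Monomials X^a Y1^b Y2^c with (1,k,k)-weighted degree a + k*b + k*c <= D."""
    return sum(D - k * (b + c) + 1              # choices of a for fixed (b, c)
               for b in range(D // k + 1)
               for c in range(D // k - b + 1))

def num_constraints(n0, r):
    """Each multiplicity-r zero at a point of F^3 imposes C(r+2, 3) conditions."""
    return n0 * comb(r + 2, 3)

# A nonzero Q exists whenever the homogeneous linear system is underdetermined:
n0, k, r, D = 100, 10, 3, 200
print(num_monomials(D, k), num_constraints(n0, r))
print(num_monomials(D, k) > num_constraints(n0, r))  # True
```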

Figures (6)

  • Figure 1: Error-correction radius $\rho$ plotted against the rate $R$ of the code for known algorithms. The best possible trade-off, i.e., capacity, is $\rho=1-R$, and our work achieves this.
  • Figure 2: Error-correction radius $\rho$ of our algorithm for binary codes plotted against the rate $R$. The best possible trade-off, i.e., capacity, is $\rho=H^{-1}(1-R)$, and is also plotted.
  • Figure 3: Folding of the Reed-Solomon code with parameter $m=4$.
  • Figure 4: The correspondence between a folded Reed-Solomon code (with $m=4$ and $x_i=\gamma^i$) and the Parvaresh-Vardy code (of order $s=2$) evaluated over $\{1,\gamma,\gamma^2,\gamma^{4},\dots,\gamma^{n-4},\gamma^{n-3},\gamma^{n-2}\}$. The correspondence for the first block in the folded RS codeword and the first three blocks in the PV codeword is shown explicitly in the left corner of the figure.
  • Figure 5: Error-correction radius $\max(\rho_{{\rm b}}^{(m,2)}(R),\rho_{{\rm a}}^{(m,2)}(R))$ for $m=4,5$. For comparison $\rho_{{\rm GS}}(R)=1-\sqrt{R}$ and the limit $1-R^{2/3}$ are also plotted. For $m=5$, the performance of the trivariate interpolation algorithm strictly improves upon that of $\rho_{{\rm GS}}$ for all rates.
  • ...and 1 more figure
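The trade-offs named in the captions of Figures 1 and 5 can be compared numerically: the list-decoding capacity $\rho = 1-R$, the Guruswami-Sudan radius $\rho_{\rm GS}(R) = 1-\sqrt{R}$, and the limiting radius $1-R^{2/3}$ of the trivariate algorithm. The short sketch below evaluates all three at a few sample rates; only these three formulas, stated in the captions, are used.

```python
# Radii from the figure captions, as functions of the rate R in (0, 1):
radii = {
    "capacity 1-R":          lambda R: 1 - R,
    "GS 1-sqrt(R)":          lambda R: 1 - R ** 0.5,
    "trivariate 1-R^(2/3)":  lambda R: 1 - R ** (2 / 3),
}

for R in (0.1, 0.25, 0.5, 0.75):
    row = "  ".join(f"{name}: {f(R):.3f}" for name, f in radii.items())
    print(f"R={R:.2f}  {row}")
```

For every rate strictly between 0 and 1 the ordering is capacity > trivariate limit > Guruswami-Sudan, which is the gap the figures visualize.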

Theorems & Definitions (37)

  • Definition 2.1: Folded Reed-Solomon Code
  • Remark 2.1: Origins of term "folded RS codes"
  • Definition 3.1
  • Definition 3.2: Multiplicity of zeroes
  • Lemma 3.1
  • Proof
  • Lemma 3.2
  • Proof
  • Lemma 3.3
  • Theorem 3.4
  • ...and 27 more