
Contraction of Private Quantum Channels and Private Quantum Hypothesis Testing

Theshani Nuradha, Mark M. Wilde

TL;DR

This work analyzes how quantum divergences contract under privacy constraints implemented via quantum local differential privacy (QLDP). It develops a practical QLDP mechanism combining a measurement with a depolarizing step and derives tight privatized contraction bounds for the hockey-stick divergence and the trace distance, with explicit, dimension-free expressions such as $\eta_T^\varepsilon = (e^\varepsilon-1)/(e^\varepsilon+1)$. The authors extend these results to the Bures distance and quantum relative entropy relative to the normalized trace distance, and apply them to non-asymptotic private quantum hypothesis testing, obtaining both general and instance-specific sample-complexity bounds, including sharp results for orthogonal states and low-privacy regimes. They also explore downstream applications to quantum fairness and Holevo information stability, illustrating how private channels can preserve stability and generalization-like properties in quantum learning settings.
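As an illustration of the dimension-free bound quoted above, here is a minimal numeric sketch assuming only the closed-form expression $\eta_T^\varepsilon = (e^\varepsilon-1)/(e^\varepsilon+1)$ from the TL;DR; the function name `eta_T` is ours, not the paper's:

```python
import math

def eta_T(eps: float) -> float:
    """Privatized contraction coefficient of the trace distance
    under eps-QLDP: (e^eps - 1) / (e^eps + 1)."""
    return (math.exp(eps) - 1.0) / (math.exp(eps) + 1.0)

# The coefficient is 0 at eps = 0 (perfect privacy: outputs are
# indistinguishable) and tends to 1 as eps grows (privacy vanishes,
# and the data-processing bound of 1 is recovered).
for eps in [0.0, 0.5, 1.0, 2.0, 5.0]:
    print(f"eps = {eps:>3}: eta_T = {eta_T(eps):.4f}")
```

Note that the expression equals $\tanh(\varepsilon/2)$ and involves no dimension parameter, which is what "dimension-free" refers to.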

Abstract

A quantum generalized divergence by definition satisfies the data-processing inequality; as such, the relative decrease in such a divergence under the action of a quantum channel is at most one. This relative decrease is formally known as the contraction coefficient of the channel and the divergence. Interestingly, there exist combinations of channels and divergences for which the contraction coefficient is strictly less than one. Furthermore, understanding the contraction coefficient is fundamental for the study of statistical tasks under privacy constraints. To this end, here we establish upper bounds on contraction coefficients for the hockey-stick divergence under privacy constraints, where privacy is quantified with respect to the quantum local differential privacy (QLDP) framework, and we fully characterize the contraction coefficient for the trace distance under privacy constraints. With the machinery developed, we also determine an upper bound on the contraction of both the Bures distance and quantum relative entropy relative to the normalized trace distance, under QLDP constraints. Next, we apply our findings to establish bounds on the sample complexity of quantum hypothesis testing under privacy constraints. Furthermore, we study various scenarios in which the sample complexity bounds are tight, while providing order-optimal quantum channels that achieve those bounds. Lastly, we show how private quantum channels provide fairness and Holevo information stability in quantum learning settings.

Paper Structure

This paper contains 25 sections, 24 theorems, and 245 equations.

Key Result

Proposition 1

For all $\varepsilon \geq 0$ and $p\in[0,1]$, the mechanism $\mathcal{A}^p_{\mathrm{Dep}} \circ \mathcal{M}$, which composes a measurement channel $\mathcal{M}$ with a depolarizing channel $\mathcal{A}^p_{\mathrm{Dep}}$ of parameter $p$, satisfies $\varepsilon$-QLDP whenever $p$ and $\varepsilon$ satisfy the condition stated in the paper. Here, for a fixed measurement operator $M$ (satisfying $0 \leq M \leq I$) and an input state $\omega$, $\mathcal{M}$ is the measurement channel defined in terms of $M$.
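A minimal sketch of the depolarizing step of this mechanism alone (the measurement channel $\mathcal{M}$ and the exact QLDP condition on $p$ are stated in the paper and omitted here); since $\mathcal{A}^p_{\mathrm{Dep}}(\rho)-\mathcal{A}^p_{\mathrm{Dep}}(\sigma)=(1-p)(\rho-\sigma)$, the depolarizing channel contracts the trace distance by exactly $1-p$, which the code below checks numerically on a pair of qubit states:

```python
import numpy as np

def depolarize(rho: np.ndarray, p: float) -> np.ndarray:
    """Depolarizing channel: mix the input with the maximally mixed state I/d."""
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

def trace_distance(rho: np.ndarray, sigma: np.ndarray) -> float:
    """Half the sum of absolute eigenvalues of the (Hermitian) difference."""
    return 0.5 * float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

# Two orthogonal qubit states |0><0| and |1><1| (trace distance 1).
rho = np.diag([1.0, 0.0])
sigma = np.diag([0.0, 1.0])

p = 0.3
td_in = trace_distance(rho, sigma)
td_out = trace_distance(depolarize(rho, p), depolarize(sigma, p))
print(td_in, td_out)  # output distance equals (1 - p) times input distance
```

This illustrates the contraction phenomenon the paper studies; the paper's results relate the admissible $p$ to the privacy parameter $\varepsilon$ and bound the resulting contraction coefficients.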

Theorems & Definitions (42)

  • Definition 1: Classical Local Differential Privacy
  • Definition 2: Quantum Local Differential Privacy
  • Remark 1: Connection to Hockey-stick Divergence
  • Remark 2: Connection to Max-Relative Entropy and Datta--Leditzky Divergence
  • Proposition 1: QLDP Mechanism
  • Theorem 1: Privatized Contraction Coefficient of Hockey-Stick Divergence
  • Lemma 1
  • Corollary 1: Contraction of Hockey-Stick Divergence under QLDP
  • Theorem 2: Privatized Contraction Coefficient of Trace Distance
  • Proposition 2: Contraction of Bures Distance under $\varepsilon$-QLDP
  • ...and 32 more