Quantum Algorithms for Gibbs Expectation of Non-log-concave and Heavy-tailed Distributions

Xinmiao Li, Jin-Peng Liu

Abstract

We establish a systematic framework of unbiased quantum sampling and estimation protocols for the classical Gibbs expectation. This framework generalizes existing approaches to the partition function estimation and has broader applications in various fields. We consider sampling and estimation for a wide class of non-log-concave distributions, particularly heavy-tailed ones, under relaxed assumptions beyond strong convexity, such as dissipativity. We develop an unbiased extension of quantum-accelerated multilevel Monte Carlo (QA-MLMC) to eliminate all biases from discretization and time truncation, together with introducing a change-of-measure approach and the Girsanov theorem via Radon-Nikodym derivatives. As a result, our approach achieves quantum complexity $\widetilde{\mathcal{O}}(ε^{-1})$ within error $ε$, whereas the classical MLMC requires $\widetilde{\mathcal{O}}(ε^{-2})$ and existing quantum algorithms yield biased estimators under stronger assumptions. Furthermore, our unified framework enables unbiased quantum sampling and estimation for certain heavy-tailed distributions after transformation. We provide several concrete applications of our approach in statistics, machine learning, and finance, towards more practical scenarios of the quantum acceleration of stochastic processes.
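As a hedged illustration of the change-of-measure ingredient mentioned above (the notation below is generic, not taken from the paper): for two diffusions $dX_t = b(X_t)\,dt + \sqrt{2}\,dW_t$ under $\mathbb{P}$ and $dX_t = \tilde b(X_t)\,dt + \sqrt{2}\,dW_t$ under $\mathbb{Q}$, Girsanov's theorem gives the Radon-Nikodym derivative of the path measures as

$$\frac{d\mathbb{Q}}{d\mathbb{P}}\bigg|_{\mathcal{F}_T} = \exp\!\left(\frac{1}{\sqrt{2}}\int_0^T \big(\tilde b - b\big)(X_t)\cdot dW_t \;-\; \frac{1}{4}\int_0^T \big\|\tilde b - b\big\|^2(X_t)\,dt\right),$$

so that expectations computed under approximate or truncated dynamics can be reweighted to the target dynamics.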

Paper Structure

This paper contains 42 sections, 61 theorems, 446 equations, 3 figures, 1 table, and 4 algorithms.

Key Result

Lemma 1

Assume that $f$ is dissipative and $L$-smooth. Then the Langevin diffusion (SDE0) is geometrically ergodic with invariant measure $\pi \propto e^{-f(x)}$. $\blacktriangleleft$
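Lemma 1 requires only dissipativity, so non-log-concave targets such as the double-well potential $f(x) = (x^2-1)^2$ are covered. A minimal classical sketch of the discretized Langevin dynamics (the unadjusted Langevin algorithm; the potential and step size here are illustrative assumptions, not the paper's quantum protocol):

```python
import numpy as np

def grad_f(x):
    # Double-well potential f(x) = (x**2 - 1)**2: non-log-concave but dissipative.
    return 4.0 * x * (x**2 - 1.0)

def ula_sample(n_steps=20000, step=0.01, seed=0):
    """Unadjusted Langevin algorithm targeting pi ∝ exp(-f):
    x_{k+1} = x_k - h * grad_f(x_k) + sqrt(2h) * xi_k, xi_k ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = 0.0
    traj = np.empty(n_steps)
    for k in range(n_steps):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal()
        traj[k] = x
    return traj

traj = ula_sample()
```

The dissipative drift pulls the chain back toward the origin from far away, so the trajectory remains bounded and, after burn-in, spends its time near the two modes at $x = \pm 1$.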

Figures (3)

  • Figure 4: Extension of Applicable Regimes in This Work
  • Figure 5: Relationships among the above several definitions
  • Figure 6: Histogram of samples generated by the spring–coupled Langevin sampler

Theorems & Definitions (124)

  • Remark 3
  • Definition 1: $L$-Smoothness
  • Definition 2: $L$-Hessian-Smoothness
  • Definition 3: One-sided Lipschitz / Strong convexity
  • Definition 4: Weaker One-sided Lipschitz
  • Definition 5: Dissipativity
  • Lemma 1: Ergodicity under Dissipativity
  • Lemma 2: Powering Lemma (Jerrum1986RandomGO)
  • Theorem 1: Unbiased Quantum Mean Estimation (Sidford2023QuantumSF)
  • Theorem 2: Multilevel Monte Carlo
  • ...and 114 more
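Theorem 2 above refers to multilevel Monte Carlo, the classical baseline that QA-MLMC accelerates. A hedged classical sketch of the telescoping level structure (a generic Euler-Maruyama coupling with shared Brownian increments; the drift, payoff, and level counts are illustrative assumptions, not the paper's estimator):

```python
import numpy as np

def mlmc_estimate(payoff, drift, T=1.0, L=4, N0=4000, seed=0):
    """Classical MLMC for E[payoff(X_T)], where dX = drift(X) dt + sqrt(2) dW.

    Level l uses Euler step h_l = T / 2**l; the coarse and fine paths at each
    level share Brownian increments, so the level corrections have small
    variance and the telescoping sum recovers the finest-level estimate."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for l in range(L + 1):
        n_fine = 2 ** l
        h = T / n_fine
        N = max(N0 // 2 ** l, 50)   # fewer samples on costlier (finer) levels
        xf = np.zeros(N)            # fine path, step h
        xc = np.zeros(N)            # coarse path, step 2h (used when l > 0)
        dWc = np.zeros(N)
        for k in range(n_fine):
            dW = np.sqrt(h) * rng.standard_normal(N)
            xf += drift(xf) * h + np.sqrt(2.0) * dW
            if l > 0:
                if k % 2 == 0:
                    dWc = dW        # first half of the coarse increment
                else:
                    xc += drift(xc) * 2.0 * h + np.sqrt(2.0) * (dWc + dW)
        level_corr = payoff(xf) - (payoff(xc) if l > 0 else 0.0)
        est += level_corr.mean()
    return est

# Ornstein-Uhlenbeck check: dX = -X dt + sqrt(2) dW with X_0 = 0 gives
# X_T ~ N(0, 1 - exp(-2T)), so E[X_T^2] = 1 - exp(-2) at T = 1.
est = mlmc_estimate(lambda x: x ** 2, lambda x: -x)
```

Within this classical scheme each level correction still carries the $\mathcal{O}(\varepsilon^{-2})$ sampling cost of Monte Carlo mean estimation; the quantum speedup in the abstract comes from replacing those per-level mean estimates with quantum mean estimation, reducing the overall complexity to $\widetilde{\mathcal{O}}(\varepsilon^{-1})$.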