Conditional Bayesian Quadrature

Zonghao Chen, Masha Naslidnyk, Arthur Gretton, François-Xavier Briol

TL;DR

This work tackles the costly estimation of conditional expectations $I(\theta)=\mathbb{E}_{X\sim\mathbb{P}_\theta}[f(X,\theta)]$ by introducing Conditional Bayesian Quadrature (CBQ), a two-stage hierarchical GP method that produces a Gaussian posterior on $I(\theta)$. In stage one, Bayesian quadrature yields estimates $\hat{I}_{\text{BQ}}(\theta_t)$ with uncertainties $\sigma^2_{\text{BQ}}(\theta_t)$ at a set of parameter values $\theta_1, \dots, \theta_T$; these are then treated as noisy observations by a second-stage GP over $\Theta$ to obtain $\hat{I}_{\text{CBQ}}(\theta)$. The authors establish theoretical convergence rates that decay quickly in both the number of samples $N$ and the number of parameter values $T$, and demonstrate through experiments that CBQ outperforms Monte Carlo and regression-based baselines on Bayesian sensitivity analysis, SIR modeling, financial option pricing, and expected value of partial perfect information (EVPPI) calculations, while providing meaningful finite-sample uncertainty quantification. Overall, CBQ offers a principled, data-efficient framework for expensive conditional expectations, with practical impact across statistics, finance, and health economics. Future directions include active learning to adaptively choose $N$, $T$, and the parameter locations, and extending CBQ to nested expectation problems.
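To make the two-stage construction concrete, here is a minimal NumPy sketch (not the authors' implementation). It assumes a one-dimensional standard Gaussian base measure $\mathbb{P}_\theta = \mathcal{N}(0,1)$ and Gaussian (RBF) kernels, so the stage-one BQ weights have closed form; the lengthscales, jitter, and the toy integrand $f(x,\theta)=\sin(\theta x)$ (for which $I(\theta)=0$ by symmetry, a convenient sanity check) are illustrative choices only.

```python
import numpy as np

def rbf(a, b, ell):
    """Gaussian (RBF) kernel matrix between 1-D arrays a and b."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

def bq_stage_one(x, fx, ell=0.5, jitter=1e-8):
    """Stage 1: closed-form Bayesian quadrature at one theta_t.

    Assumes X ~ N(0, 1), so the kernel mean embedding of the RBF kernel
    is available in closed form (an illustrative choice, not the paper's
    general setting). Returns the BQ posterior mean and variance of I(theta_t).
    """
    K = rbf(x, x, ell) + jitter * np.eye(len(x))
    # Kernel mean embedding mu(x) = E_{X' ~ N(0,1)}[k(x, X')].
    mu = ell / np.sqrt(ell**2 + 1) * np.exp(-x**2 / (2 * (ell**2 + 1)))
    # Initial error: double integral of k under N(0,1) x N(0,1).
    kk = ell / np.sqrt(ell**2 + 2)
    w = np.linalg.solve(K, mu)
    return w @ fx, kk - w @ mu

def cbq_stage_two(thetas, I_bq, var_bq, theta_grid, ell=1.0):
    """Stage 2: GP regression over theta; the stage-one BQ variances
    enter as heteroscedastic observation noise."""
    K = rbf(thetas, thetas, ell) + np.diag(var_bq)
    k_star = rbf(theta_grid, thetas, ell)
    mean = k_star @ np.linalg.solve(K, I_bq)
    var = 1.0 - np.einsum("ij,ji->i", k_star, np.linalg.solve(K, k_star.T))
    return mean, np.maximum(var, 0.0)

# Toy problem: f(x, theta) = sin(theta * x) with X ~ N(0, 1),
# so I(theta) = E[sin(theta X)] = 0 for every theta.
rng = np.random.default_rng(0)
T, N = 10, 30
thetas = np.linspace(0.1, 2.0, T)
I_bq = np.empty(T)
var_bq = np.empty(T)
for t, th in enumerate(thetas):
    x = rng.standard_normal(N)
    I_bq[t], var_bq[t] = bq_stage_one(x, np.sin(th * x))
grid = np.linspace(0.1, 2.0, 50)
mean, var = cbq_stage_two(thetas, I_bq, var_bq, grid)
```

Note the design choice this sketch mirrors: the stage-one posterior variances enter stage two as heteroscedastic observation noise, which is how CBQ propagates quadrature uncertainty on each $\hat{I}_{\text{BQ}}(\theta_t)$ into the final Gaussian posterior on $I(\theta)$.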

Abstract

We propose a novel approach for estimating conditional or parametric expectations in settings where obtaining samples or evaluating integrands is costly. Through the framework of probabilistic numerical methods (such as Bayesian quadrature), our approach allows us to incorporate prior information about the integrands, in particular prior knowledge of the smoothness of the integrands and of the conditional expectation. As a result, our approach provides a way of quantifying uncertainty and achieves a fast convergence rate, which we confirm both theoretically and empirically on challenging tasks in Bayesian sensitivity analysis, computational finance, and decision-making under uncertainty.

Paper Structure

This paper contains 53 sections, 10 theorems, 74 equations, 14 figures, and 1 table.

Key Result

Theorem 1

Let $x \mapsto f(x, \theta)$ be a function of smoothness $s_f > d/2$, and $\theta \mapsto f(x, \theta)$ be a function of smoothness $s_I > p/2$ such that $\sup_{\theta \in \Theta} \max_{|\alpha|<s_I} \| D_\theta^\alpha f(\cdot, \theta) \|_{\mathcal{W}^{s_I, 2}(\mathcal{X})}<\infty$. Suppose, in addition, that the paper's remaining regularity assumptions hold. Then, for any $\delta \in (0, 1)$ there is an $N_0>0$ such that for any $N \geq N_0$, with probability at least $1-\delta$, …
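The extracted statement is truncated before the bound itself. For orientation only, results of this type typically assert a bound of the following form, combining a BQ rate in $N$ with a GP-regression rate in $T$; the exponents and constants below are an assumption based on the stated smoothness parameters, not a quotation from the paper:

$$\sup_{\theta \in \Theta} \big| \hat{I}_{\text{CBQ}}(\theta) - I(\theta) \big| \;\leq\; C_\delta \big( N^{-s_f/d + \epsilon} + T^{-s_I/p + \epsilon} \big),$$

for any $\epsilon > 0$, where $C_\delta$ depends on $\delta$ but not on $N$ or $T$.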

Figures (14)

  • Figure 1: Illustration of conditional Bayesian quadrature (CBQ). The first stage gives a GP posterior of $f(x,\theta)$ for each $\theta \in \{\theta_1, \cdots, \theta_T\}$, which are then integrated to give $\hat{I}_{\text{BQ}}(\theta_1), \cdots, \hat{I}_{\text{BQ}}(\theta_T)$. The second stage then combines all BQ estimates from the first stage to give a GP posterior of $I(\theta)$: $\hat{I}_{\text{CBQ}}(\theta)$. All shaded areas represent Bayesian quantification of uncertainty.
  • Figure 2: Illustration of CBQ. Left: Directed acyclic graph representation. Circle nodes indicate random variables and rectangles correspond to independent replications over indices. Right: BQ and CBQ posteriors on $I(\theta_{1:2})=[I(\theta_1), I(\theta_2)]^\top$ for $\theta_1 \approx \theta_2$. Unlike BQ, the CBQ posterior accounts for the relation between the two quantities.
  • Figure 3: Bayesian sensitivity analysis for linear models. Left: RMSE of all methods when $d=2$ and $N=50$. Middle: RMSE of all methods when $d=2$ and $T=50$. Right: RMSE of all methods when $N=T=100$.
  • Figure 4: Bayesian linear model sensitivity analysis in $d=2$.
  • Figure 5: Bayesian sensitivity analysis for the SIR model & option pricing in mathematical finance. Left: RMSE of all methods for the SIR example with $T=15$. Middle: The computational cost (in wall-clock time) for CBQ ($T=15, N=40$) and for obtaining a single numerical solution of the SIR equations under different discretization step sizes. In practice, the process of obtaining samples from the SIR equations is repeated $NT$ times. Right: RMSE of all methods for the finance example with $T=20$.
  • ...and 9 more figures

Theorems & Definitions (21)

  • Theorem 1
  • Theorem 2: Generalisation of Theorem 1
  • Theorem 3
  • Proof
  • Proposition 1
  • Proof
  • Theorem 4
  • Proof
  • Lemma 1: Modified Lemma 18 in gogolashvili2023importance
  • Proof
  • ...and 11 more