Conditional Bayesian Quadrature
Zonghao Chen, Masha Naslidnyk, Arthur Gretton, François-Xavier Briol
TL;DR
This work tackles the costly estimation of conditional expectations $I(\theta)=\mathbb{E}_{X\sim\mathbb{P}_\theta}[f(X,\theta)]$ by introducing Conditional Bayesian Quadrature (CBQ), a two-stage hierarchical Gaussian process (GP) method that yields a Gaussian posterior on $I(\theta)$. In stage one, Bayesian quadrature produces an estimate $\hat{I}_{\text{BQ}}(\theta_t)$ with uncertainty $\sigma^2_{\text{BQ}}(\theta_t)$ at each of $T$ parameter values; in stage two, these estimates are treated as noisy observations in a GP regression over $\Theta$, yielding $\hat{I}_{\text{CBQ}}(\theta)$ at any $\theta$. The authors establish convergence rates that decay quickly in both the number of samples $N$ per parameter and the number of parameters $T$, and show empirically that CBQ outperforms Monte Carlo and regression-based baselines on Bayesian sensitivity analysis, SIR epidemic modeling, financial option pricing, and EVPPI computation, while providing meaningful finite-sample uncertainty quantification. Overall, CBQ offers a principled, data-efficient framework for expensive conditional expectations, with practical impact across statistics, finance, and health economics. Future directions include active learning to adaptively choose $N$, $T$, and the parameter locations, and extending CBQ to nested expectation problems.
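The two-stage construction above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a toy problem where $\mathbb{P}_\theta = \mathcal{N}(\theta, 1)$ and uses an RBF (squared-exponential) kernel, for which the stage-one kernel mean embedding against a Gaussian measure has a closed form. Stage two is plain GP regression over $\theta$ with the stage-one BQ variances entering as heteroskedastic observation noise. All function names and hyperparameter choices (`ell`, `jitter`, etc.) are illustrative.

```python
import numpy as np

def rbf(a, b, ell):
    """Squared-exponential kernel matrix between 1-D arrays a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def bq_gaussian(x, fx, mu, sigma, ell=1.0, jitter=1e-6):
    """Stage 1: Bayesian quadrature of f under N(mu, sigma^2).

    Uses the closed-form kernel mean of an RBF kernel against a Gaussian
    measure. Returns the posterior mean and variance of the integral.
    """
    K = rbf(x, x, ell) + jitter * np.eye(len(x))
    # kernel mean embedding: z_i = E_{X ~ N(mu, sigma^2)}[k(x_i, X)]
    s2 = ell ** 2 + sigma ** 2
    z = np.sqrt(ell ** 2 / s2) * np.exp(-0.5 * (x - mu) ** 2 / s2)
    # initial error: E[k(X, X')] for independent X, X' ~ N(mu, sigma^2)
    c0 = np.sqrt(ell ** 2 / (ell ** 2 + 2 * sigma ** 2))
    w = np.linalg.solve(K, z)
    return w @ fx, max(c0 - z @ w, 0.0)  # clamp tiny negative variances

def cbq(thetas, theta_star, f, n_samples=20, ell_theta=1.0, seed=0):
    """Stage 2: GP over theta, with BQ estimates as noisy observations."""
    rng = np.random.default_rng(seed)
    means, variances = [], []
    for t in thetas:
        x = rng.normal(t, 1.0, n_samples)  # X ~ P_theta = N(theta, 1)
        m, v = bq_gaussian(x, f(x, t), mu=t, sigma=1.0)
        means.append(m)
        variances.append(v)
    means, variances = np.array(means), np.array(variances)
    # heteroskedastic noise: each BQ variance enters on the diagonal
    K = rbf(thetas, thetas, ell_theta) + np.diag(variances + 1e-8)
    k_star = rbf(np.atleast_1d(theta_star), thetas, ell_theta)
    return (k_star @ np.linalg.solve(K, means))[0]
```

As a sanity check, for $f(x,\theta)=x^2$ the truth is $I(\theta)=\theta^2+1$, so `cbq(np.linspace(-2, 2, 10), 0.5, lambda x, t: x**2)` should land close to $1.25$. The posterior variance at $\theta^*$ (omitted here for brevity) would combine the prior kernel value with the same solve.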
Abstract
We propose a novel approach for estimating conditional or parametric expectations in settings where obtaining samples or evaluating integrands is costly. Through the framework of probabilistic numerical methods (in particular, Bayesian quadrature), our approach incorporates prior information about the integrands, especially prior knowledge of the smoothness of the integrands and of the conditional expectation. As a result, it provides uncertainty quantification and achieves a fast convergence rate, which we confirm both theoretically and empirically on challenging tasks in Bayesian sensitivity analysis, computational finance, and decision making under uncertainty.
