Taking Bigger Metropolis Steps by Dragging Fast Variables
Radford M. Neal
TL;DR
The paper addresses an inefficiency of Metropolis-Hastings updates that arises when the variables split into 'slow' and 'fast' subsets, where the density can be cheaply recomputed after a change to the fast variables alone. It introduces a 'dragging' strategy that proposes a large step for the slow variables x while the fast variables y are dragged along through intermediate updates, preserving detailed balance while approximating a Metropolis update on the marginal distribution of x. Two variants are developed: one using a single intermediate transition and one using a multi-step ladder of intermediate distributions; as the number of steps grows, the method approaches the efficiency of marginal Metropolis updates while keeping the inner computations fast. Empirical results on controlled test distributions demonstrate substantial gains in sampling efficiency for the slow variables, with robustness to additional fast variables and connections to tempered transitions as a generalization.
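The multi-step ladder can be sketched in general notation (the symbols and indexing below are a plausible reconstruction, not taken verbatim from the paper). With joint density π(x, y), current state (x, y), and proposed slow value x*, one interpolates between the fast-variable conditionals at x and at x*:

```latex
% Interpolating ("dragging") densities between the current slow value x
% and the proposal x^*, for a joint density \pi(x, y):
\rho_i(y) \propto \pi(x, y)^{\,1 - i/n}\, \pi(x^*, y)^{\,i/n},
\qquad i = 0, \dots, n .

% Set y_0 = y; for i = 1, \dots, n-1, draw y_i from a transition that
% leaves \rho_i invariant. The pair (x^*, y_{n-1}) is then accepted
% (for a symmetric proposal on x) with probability
\min\!\left(1,\; \prod_{i=0}^{n-1} \frac{\rho_{i+1}(y_i)}{\rho_i(y_i)}\right).
```

Because each ratio ρ_{i+1}(y_i)/ρ_i(y_i) only compares two unnormalized densities at the same y_i, the normalizing constants never need to be computed, and the product of ratios is what preserves detailed balance for the joint chain.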
Abstract
I show how Markov chain sampling with the Metropolis-Hastings algorithm can be modified so as to take bigger steps when the distribution being sampled from has the characteristic that its density can be quickly recomputed for a new point if this point differs from a previous point only with respect to a subset of 'fast' variables. I show empirically that when using this method, the efficiency of sampling for the remaining 'slow' variables can approach what would be possible using Metropolis updates based on the marginal distribution for the slow variables.
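A minimal runnable sketch of the multi-step dragging update, on a toy model invented here for illustration (a Gaussian slow variable x with a tightly coupled fast variable y; the model, function names, and tuning parameters are assumptions, not from the paper):

```python
import math
import random

def log_pi(x, y):
    # Toy joint chosen for illustration (not from the paper):
    # slow x ~ N(0, 1), fast y | x ~ N(x, 0.3^2).
    return -0.5 * x ** 2 - 0.5 * ((y - x) / 0.3) ** 2

def dragging_update(x, y, n=20, step_x=1.0, step_y=0.3, rng=random):
    """One 'dragging' Metropolis update: propose a big step for slow x,
    drag fast y through n interpolating distributions, and accept or
    reject the whole trajectory, preserving detailed balance."""
    x_star = x + rng.gauss(0.0, step_x)  # large proposal for the slow variable
    yi = y
    log_w = 0.0                          # log of the telescoping weight product
    for i in range(n):
        # Weight factor rho_{i+1}(y_i) / rho_i(y_i), where
        # rho_i(y) is proportional to pi(x, y)^(1 - i/n) * pi(x_star, y)^(i/n).
        log_w += (log_pi(x_star, yi) - log_pi(x, yi)) / n
        if i < n - 1:
            # Cheap inner Metropolis step for y, leaving rho_{i+1} invariant;
            # only the fast part of the density changes between evaluations.
            b = (i + 1) / n
            y_prop = yi + rng.gauss(0.0, step_y)
            delta = ((1 - b) * log_pi(x, y_prop) + b * log_pi(x_star, y_prop)
                     - (1 - b) * log_pi(x, yi) - b * log_pi(x_star, yi))
            if math.log(rng.random()) < delta:
                yi = y_prop
    if math.log(rng.random()) < log_w:
        return x_star, yi                # accept the dragged proposal
    return x, y                          # reject: keep the old state

if __name__ == "__main__":
    rng = random.Random(1)
    x, y = 0.0, 0.0
    xs = []
    for _ in range(20000):
        x, y = dragging_update(x, y, rng=rng)
        xs.append(x)
    mean = sum(xs) / len(xs)
    var = sum((v - mean) ** 2 for v in xs) / len(xs)
    # The marginal of x should be close to N(0, 1).
    print(round(mean, 2), round(var, 2))
```

The point of the construction is visible in the loop: each inner step touches only the fast variable, so when recomputing the density after a fast-variable change is cheap, the ladder lets x take much bigger steps than plain Metropolis on the joint would allow.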
