Efficient and Fast Sampling from Arbitrary Probability Kernels using Sliced Gibbs Sampler

Prithwish Ghosh, Sujit K Ghosh

Abstract

An Automated Sliced Gibbs (ASG) framework is proposed for fully automated Markov chain Monte Carlo sampling from arbitrary finite-dimensional probability kernels. The method targets unnormalized, non-smooth, heavy-tailed, and highly multimodal densities. A Cauchy-transformation-based effective-support estimator is combined with slice-driven Gibbs updates. This construction removes the need for user-specified truncation bounds, proposal scales, step-size tuning, or conditional optimization. Unlike existing slice samplers, ASG does not require manually chosen bracket widths or geometric insight into the support. All calibration is performed automatically within each Gibbs cycle. The resulting Markov chain preserves invariance and ergodicity. Automated support detection allows efficient movement across disconnected high-density regions. The sampler adapts to sharp curvature and irregular geometry without gradient information. Extensive numerical experiments evaluate performance on complex kernels, including univariate Beta mixtures, multivariate Rosenbrock and Ackley benchmarks, and non-smooth kernels derived from generalized-LASSO-type loss functions. Across these challenging settings, ASG consistently achieves a higher effective sample size per second and faster decorrelation than random-walk Metropolis–Hastings, adaptive Gibbs variants, and several recently proposed slice-based methods. The framework provides a scalable, general-purpose solution for sampling from complicated probability kernels where existing algorithms require substantial tuning or exhibit slow mixing.

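The abstract describes a slice-within-Gibbs scheme whose bracket for each coordinate update comes from an automatically estimated effective support. The paper's Cauchy-transformation estimator is not reproduced here, so the sketch below approximates it by scanning a grid warped through the inverse Cauchy CDF (which covers the whole real line); all function and parameter names are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def effective_support(logk, frac=1e-8, grid=2001):
    """Crude stand-in for a Cauchy-transform support estimator: scan a
    uniform grid on (0, 1) mapped through the inverse Cauchy CDF and keep
    the region where the log-kernel is within log(frac) of its maximum."""
    t = np.linspace(1e-6, 1.0 - 1e-6, grid)
    x = np.tan(np.pi * (t - 0.5))            # inverse Cauchy CDF: covers all of R
    lv = np.array([logk(xi) for xi in x])
    keep = x[lv > lv.max() + np.log(frac)]   # points inside the effective support
    return keep.min(), keep.max()

def slice_gibbs(logk_joint, x0, n=2000):
    """One slice update per coordinate, bracketed by the estimated support."""
    x = np.asarray(x0, dtype=float)
    d = x.size
    out = np.empty((n, d))
    for i in range(n):
        for j in range(d):
            # Log-kernel of coordinate j with the others held fixed.
            cond = lambda v: logk_joint(np.r_[x[:j], v, x[j + 1:]])
            lo, hi = effective_support(cond)            # automatic bracket
            logu = cond(x[j]) + np.log(rng.uniform())   # slice height
            while True:                                 # sample on the slice,
                v = rng.uniform(lo, hi)                 # shrinking on rejection
                if cond(v) > logu:
                    x[j] = v
                    break
                if v < x[j]:
                    lo = v
                else:
                    hi = v
        out[i] = x
    return out
```

Because the bracket is recomputed from the kernel itself at every conditional update, no step size or bracket width is supplied by the user, which is the tuning-free behavior the abstract emphasizes.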

Paper Structure

This paper contains 29 sections, 2 theorems, 45 equations, 24 figures, 12 tables, 2 algorithms.

Key Result

Theorem 3.1

Consider the SG sampler given in (eq:sgs) based on an arbitrary target kernel given in (eq:K(x)). Then the corresponding transition kernel given by (eq:trans) produces a Markov chain with stationary kernel given by (eq:stationary). $\blacktriangleleft$
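Since the referenced equations are not reproduced on this page, the following is only a sketch of the standard slice-sampling augmentation on which such stationarity results rest, with notation assumed:

```latex
% Augment the target kernel K(x) with an auxiliary slice variable u:
p(x, u) \;\propto\; \mathbf{1}\{0 < u < K(x)\},
% whose x-marginal recovers the target:
\int_0^{K(x)} \! du \;\propto\; K(x).
% Each Gibbs update
u \mid x \;\sim\; \mathrm{Unif}\bigl(0, K(x)\bigr), \qquad
x \mid u \;\sim\; \mathrm{Unif}\bigl(\{x : K(x) > u\}\bigr)
% leaves p(x, u), and hence the target kernel, invariant.
```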

Figures (24)

  • Figure 1: Comparison of the proposed ASG vs. RW-MH for the univariate mixture density (see the univariate example section for details). Notice that no samples are generated from the third component of the mixture density by RW-MH in panel (b).
  • Figure 2: First 10 steps of the ASG algorithm based on the effective support over the Beta kernel (eq:k1).
  • Figure 3: Samples overlaid on a contour plot of the Rosenbrock kernel.
  • Figure 4: Contour plot of the Rosenbrock kernel with effective-support bounds. The vertical blue lines represent the estimated conditional ranges of $X_2 \mid X_1 = c$, while the horizontal red lines show the bounds of $X_1 \mid X_2 = c$. The contours illustrate the banana-shaped density curvature, and the bounds dynamically adjust to cover the effective support depending on the fixed variable.
  • Figure 5: Contour plot of the Ackley function; in both panels the red dots are samples generated by ASG (a) and RW-MH (b), overlaid on the contour plot.
  • ...and 19 more figures

Theorems & Definitions (6)

  • Theorem 3.1: Stationarity of SG sampler
  • Proof
  • Remark 3.1
  • Theorem 3.2: Uniform ergodicity of ASG sampler
  • Proof
  • Remark 3.2