Empirical Bayes for Dynamic Bayesian Networks Using Generalized Variational Inference
Vyacheslav Kungurtsev, Apaar, Aarya Khandelwal, Parth Sandeep Rastogi, Bapi Chatterjee, Jakub Mareček
TL;DR
The paper addresses uncertainty quantification for learning Dynamic Bayesian Networks (DBNs) from limited data by combining Empirical Bayes with Generalized Variational Inference. It first obtains a set of frequentist IP-based point estimates of structure and weights, then constructs a data-driven prior and a mixture model over DAG structures via Coherent Generalized Variational Inference (CGVI) with a Rényi-divergence constraint. Each candidate model is assigned its own CGVI problem, yielding a Gibbs-like posterior over its parameters; this enables parallel sampling and mixture-weight optimization, and provides uncertainty quantification at non-asymptotic sample sizes without sampling over the full DBN representation. Preliminary numerical results on a train-validation split illustrate how the approach samples from a mixture of DAG structures and from the parameter posteriors, offering practical insight into causal-structure uncertainty in small-sample regimes. By integrating frequentist structure estimation with Bayesian model averaging through a principled, divergence-constrained posterior, the methodology has the potential to improve robustness and interpretability in dynamic network learning.
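The sampling pipeline described above — a Gibbs-like parameter posterior per candidate structure, plus mixture weights over structures — can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the loss function, the softmax weighting of structures, the one-dimensional parameter, and all names (`loss`, `beta`, `gibbs_sample`) are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3       # number of candidate DAG structures (from frequentist point estimates)
beta = 1.0  # temperature of the Gibbs-like posterior exp(-beta * loss)

def loss(theta, k):
    # Stand-in empirical risk for structure k (e.g., a negative log-likelihood);
    # each structure is given a different optimum purely for illustration.
    centers = [0.0, 1.0, -1.0]
    return 0.5 * (theta - centers[k]) ** 2

def gibbs_sample(k, n_samples=1000, step=0.5):
    # Random-walk Metropolis targeting the Gibbs-like density
    # pi_k(theta) proportional to exp(-beta * loss(theta, k)) * N(theta; 0, 4) prior.
    theta = 0.0
    samples = []
    for _ in range(n_samples):
        prop = theta + step * rng.standard_normal()
        log_ratio = (-beta * loss(prop, k) - prop ** 2 / 8.0) \
                    - (-beta * loss(theta, k) - theta ** 2 / 8.0)
        if np.log(rng.uniform()) < log_ratio:
            theta = prop
        samples.append(theta)
    return np.array(samples)

# Mixture weights over structures, here a softmax of (illustrative) validation
# losses; the paper instead optimizes these weights within the CGVI framework.
val_losses = np.array([1.2, 0.8, 1.5])
w = np.exp(-val_losses)
w /= w.sum()

# Draw from the mixture: pick a structure, then a parameter sample for it.
per_model = {k: gibbs_sample(k) for k in range(K)}
ks = rng.choice(K, size=500, p=w)
draws = np.array([rng.choice(per_model[k]) for k in ks])
```

Because each structure's sampler is independent, the `gibbs_sample` calls can run in parallel, which is the practical benefit the paper highlights for this decomposition.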
Abstract
In this work, we demonstrate an Empirical Bayes approach to learning a Dynamic Bayesian Network. Starting from several point estimates of structure and weights, we construct a data-driven prior and subsequently obtain a model that quantifies uncertainty. The approach builds on a recent development in Generalized Variational Inference and indicates the potential of sampling both the uncertainty over a mixture of DAG structures and a parameter posterior.
