Variational inference (VI) approximates a target distribution within a chosen family that permits i.i.d. sampling and tractable density evaluation. Because the approximation is obtained by minimizing a divergence to the target, its best achievable quality is constrained by the family’s expressiveness. Yet greater flexibility does not guarantee better results: the optimization landscape is typically highly non-convex, so the theoretical optimum is rarely attained in practice. Consequently, VI generally lacks the asymptotic exactness of Markov chain Monte Carlo (MCMC)—the ability to achieve arbitrarily accurate inference given sufficient computation, regardless of tuning.
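As a toy illustration of the principle above (not anything from the talk itself): if the variational family is the set of Gaussians, fitting amounts to minimizing a divergence, here the closed-form KL between two Gaussians, over the family's parameters. All numbers below are illustrative choices.

```python
# Toy VI sketch: approximate a target N(3, 1.5^2) within the Gaussian family
# by gradient descent on the closed-form KL(q || p) between two Gaussians.
# The target parameters are known here only to make the demo self-contained.
mu_p, sigma_p = 3.0, 1.5   # target mean and standard deviation
mu, sigma = 0.0, 1.0       # variational parameters, initialized away from target
lr = 0.1                   # step size

for _ in range(500):
    # Gradients of KL(N(mu, sigma^2) || N(mu_p, sigma_p^2)):
    grad_mu = (mu - mu_p) / sigma_p**2
    grad_sigma = -1.0 / sigma + sigma / sigma_p**2
    mu -= lr * grad_mu
    sigma -= lr * grad_sigma

# Because the family happens to contain the target, the KL can reach 0 and
# (mu, sigma) converges to (3, 1.5); with a less expressive family, the best
# achievable KL would be bounded away from 0, as the abstract notes.
```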
In this talk, I will introduce mixed variational flows (MixFlows): a framework for constructing tuning-free, asymptotically exact variational families using measure-preserving dynamical systems. The key methodological advance is a way to use involutive MCMC kernels to build variational flows, yielding families that inherit MCMC-level convergence guarantees while retaining VI’s tractability (i.i.d. sampling and closed-form density evaluation).
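A cartoon of the measure-preserving averaging idea, not the actual MixFlows construction (which uses involutive MCMC kernels; see the talk for details): an irrational rotation of the circle preserves the uniform target, and averaging the pushforwards of an initial density under iterates of the map yields a mixture with a closed-form density that converges to the target. All specifics below are illustrative.

```python
# T(x) = (x + alpha) mod 1 with irrational alpha preserves the uniform
# distribution on [0,1). Averaging the pushforwards of an initial density q0
# under T^0, ..., T^{N-1} gives a mixture whose density is closed-form and
# approaches the target density (identically 1) as N grows.
alpha = (5 ** 0.5 - 1) / 2          # golden-ratio rotation, irrational

def q0(x):
    # initial density on [0,1): q0(x) = 6 x (1 - x), far from uniform
    return 6.0 * x * (1.0 - x)

def mix_density(x, N):
    # density of the mixture (1/N) * sum_k T^k_# q0; each pushforward has
    # unit Jacobian, so its density is q0 at the pulled-back point
    return sum(q0((x - k * alpha) % 1.0) for k in range(N)) / N

vals = [mix_density(x / 10.0, 2000) for x in range(10)]
# every value in vals is close to the uniform target density 1
```

The point of the cartoon: sampling from the mixture is i.i.d. (pick k uniformly, push a q0 sample through T^k) and the density is evaluable in closed form, mirroring the tractability properties the abstract claims for MixFlows.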
I will also discuss how tools from chaotic dynamical systems illuminate how error propagates through inexact flows, that is, flows perturbed by finite-precision arithmetic and numerical discretization, providing practical guidance on when flow-based approximations remain reliable in spite of numerical instability.
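The phenomenon behind this question can be seen in a classic chaotic system (an illustration only, not the talk's analysis): under the logistic map at parameter 4, a tiny perturbation is amplified exponentially, so individual orbits separate completely within a few dozen steps, yet long-run ergodic averages of both orbits stay near the invariant-measure mean of 1/2. The specific values below are illustrative.

```python
# Pointwise vs. statistical error under chaos: the logistic map x -> 4 x (1 - x).
# A 1e-10 perturbation destroys pointwise agreement between orbits, but the
# long-run (ergodic) average of each orbit remains close to the invariant mean 0.5.
f = lambda x: 4.0 * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10            # identical orbits up to a 1e-10 perturbation
max_gap, sum_x, sum_y = 0.0, 0.0, 0.0
n_steps = 200_000

for i in range(n_steps):
    x, y = f(x), f(y)
    if i < 200:                    # pointwise divergence appears within ~50 steps
        max_gap = max(max_gap, abs(x - y))
    sum_x += x
    sum_y += y

avg_x, avg_y = sum_x / n_steps, sum_y / n_steps
# max_gap grows to order 1 despite the 1e-10 perturbation, while avg_x and
# avg_y both remain close to 0.5
```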
To join this seminar virtually, please request Zoom connection details from ea@stat.ubc.ca.
Speaker's page: https://zuhengxu.github.io/
Location: ESB 4192 / Zoom
Event date: -
Speaker: Zuheng David Xu, UBC Statistics Ph.D. student