To join this seminar virtually: Please request Zoom connection details from ea [at] stat.ubc.ca
Abstract: Generating samples from complex probability distributions is a fundamental challenge in statistical modelling. This problem is called "sampling" when we can access only an un-normalised density and "generative modelling" when we can access only a dataset of existing samples. In practice, sampling directly from the target is generally impossible, so we must introduce a simpler reference distribution, such as a Gaussian, and manipulate its density and samples to approximate the target. Direct inference is reliable when the reference is close to the target and fragile when it is not. Annealing is a popular technique motivated by this principle: it introduces a sequence of distributions that interpolates between the reference and the target, ensuring that neighbouring distributions are close. An annealing algorithm specifies how to traverse this bridge of distributions, incrementally transforming samples from the reference into samples that approximate the target.
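To make the idea concrete, here is a minimal sketch (not the algorithms discussed in the talk) of one common annealing construction: a geometric path pi_beta ∝ pi_ref^(1-beta) * pi_target^beta between a Gaussian reference and an un-normalised bimodal target, traversed with a few Metropolis-Hastings updates at each level. The specific target, schedule, and step size are illustrative choices, not taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_ref(x):
    # Standard Gaussian reference, up to an additive constant.
    return -0.5 * x**2

def log_target(x):
    # Un-normalised bimodal target: equal mixture of Gaussians at +/-3.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def log_annealed(x, beta):
    # Geometric bridge: pi_beta ∝ pi_ref^(1-beta) * pi_target^beta.
    return (1.0 - beta) * log_ref(x) + beta * log_target(x)

def anneal_samples(n=1000, n_levels=50, mh_per_level=5, step_size=1.0):
    # Start from exact samples of the reference (beta = 0) ...
    x = rng.standard_normal(n)
    # ... then walk along the bridge toward the target (beta = 1),
    # applying random-walk Metropolis-Hastings moves at each level.
    for beta in np.linspace(0.0, 1.0, n_levels + 1)[1:]:
        for _ in range(mh_per_level):
            prop = x + step_size * rng.standard_normal(n)
            log_accept = log_annealed(prop, beta) - log_annealed(x, beta)
            accept = np.log(rng.uniform(size=n)) < log_accept
            x = np.where(accept, prop, x)
    return x

samples = anneal_samples()
```

Because neighbouring levels are close, the incremental moves keep the particles near equilibrium, and the final samples populate both modes of the target; a single Markov chain started at the reference would struggle to cross between them.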
In this talk, I will discuss how annealing can be applied to both sampling and generative modelling. As a case study, I will introduce parallel tempering (PT) and denoising diffusion models (DDMs), two annealing algorithms that have recently gained popularity in the sampling and generative-modelling literatures, respectively. I will show how our analysis of PT and DDMs provides insight into the design choices behind their success and motivates design choices for other annealing algorithms in sampling and generative modelling. Finally, I will discuss applications of this body of work to Bayesian inference and PDE surrogate modelling.