Score-based diffusion models have emerged as a powerful framework for generative modeling, progressively transforming noise into structured data samples, provided a dataset is available to learn from. In this talk, we explore how to extend these ideas to the sampling setting, where the target distribution is known only up to a normalizing constant. After reviewing the fundamentals of diffusion models, we highlight a key observation: the score function central to these methods can be expressed as an expectation with respect to a time-dependent distribution whose unnormalized density is known. This perspective motivates Stochastic Localization via Iterative Posterior Sampling (SLIPS), an approach that estimates the score function with Monte Carlo methods and uses it to construct a denoising process. We will examine the theoretical underpinnings of SLIPS, with particular emphasis on its main limitation, the duality of log-concavity, which restricts its practical applicability. Building on this analysis, I will present a new approach to Iterative Posterior Sampling (forthcoming work) that bypasses explicit score estimation altogether, leading to significantly improved scalability. While this method remains affected by the same duality phenomenon, we will see that its impact is mitigated in practice.
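The key observation above, that the score of the noised target is a posterior expectation computable from the unnormalized density, can be illustrated with a minimal sketch. This is not the SLIPS algorithm itself; it is an illustrative self-normalized importance-sampling estimator under the assumed noising kernel x_t = x_0 + sigma_t * eps, where the function names and sample sizes are my own choices:

```python
# Hedged sketch: Monte Carlo estimate of the time-dependent score
# grad log p_t(x), where p_t is the target pi convolved with N(0, sigma_t^2 I).
# Uses the identity grad log p_t(x) = (E[x_0 | x_t = x] - x) / sigma_t^2,
# approximating the posterior mean by importance sampling: with proposal
# N(x_t, sigma_t^2 I), the importance weights are proportional to pi(x_0).
import numpy as np

def mc_score(x_t, log_pi, sigma_t, n_samples=4096, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    d = x_t.shape[-1]
    # Draw candidate clean points from the Gaussian proposal centered at x_t.
    x0 = x_t + sigma_t * rng.standard_normal((n_samples, d))
    # Log-weights reduce to the (unnormalized) log-density of the target.
    logw = np.array([log_pi(x) for x in x0])
    w = np.exp(logw - logw.max())
    w /= w.sum()  # self-normalize
    posterior_mean = (w[:, None] * x0).sum(axis=0)  # approx. E[x_0 | x_t]
    return (posterior_mean - x_t) / sigma_t**2

# Sanity check on a standard Gaussian target, where the smoothed score
# is available in closed form: grad log p_t(x) = -x / (1 + sigma_t^2).
log_pi = lambda x: -0.5 * np.dot(x, x)
x_t = np.array([1.0, -2.0])
est = mc_score(x_t, log_pi, sigma_t=0.5, rng=np.random.default_rng(0))
```

Note that only `log_pi`, the unnormalized log-density, is required, which is exactly the sampling setting the talk addresses; no data or pretrained network enters the estimator.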
To join this seminar virtually, please request Zoom connection details from hr.ops@stat.ubc.ca.
Speaker's page: https://h2o64.github.io/
Location: ESB 4192 / Zoom
Event date: -
Speaker: Louis Grenioux, Research Fellow, Center for Computational Mathematics, Flatiron Institute