Toward large-scale Bayesian inference with structured generative models

Friday, January 26, 2024 - 12:15 to 13:15
Ricardo Baptista, von Karman Instructor, Computing and Mathematical Sciences, California Institute of Technology
Statistics Seminar
ESB 4192 / Zoom

To join this seminar virtually: Please request Zoom connection details from ea [at] stat.ubc.ca

Abstract: Conditional generative models are an increasingly popular machine learning approach for performing Bayesian inference in large-scale scientific applications. Despite their success, the sample complexity of these methods often scales poorly with the growing dimensions of the model parameters and observations. These models require large training datasets, which are typically unavailable when the forward model is computationally expensive, as in numerical weather prediction. In these settings, it becomes crucial to identify low-dimensional structure in posterior distributions and encode it in generative models for accurate inference. In this presentation, I will introduce an information-theoretic perspective for reducing the dimensions of parameters and observations in inverse problems with guarantees on the posterior approximation error. I will show how to identify relevant subspaces for these variables either from gradient evaluations of the log-likelihood function or, when gradients are not available, from score-based generative models. The benefits of the dimension-reduction approach will be showcased on a turbulent flow estimation problem from aerodynamics, where traditional methods are unstable in small-sample regimes.
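
As a minimal illustration of the gradient-based route mentioned in the abstract, the sketch below estimates a likelihood-informed parameter subspace from Monte Carlo samples of the log-likelihood gradient on a toy linear-Gaussian inverse problem. Everything here (the forward operator G, the noise level, the subspace rank r) is an illustrative assumption, not a detail taken from the talk.

```python
import numpy as np

# Sketch: identify a parameter subspace from log-likelihood gradients.
# The diagnostic matrix H = E[ grad log L(x) grad log L(x)^T ] is estimated
# by Monte Carlo over prior samples; its leading eigenvectors span the
# directions in which the data are most informative about the parameters.

rng = np.random.default_rng(0)
d = 50           # parameter dimension (illustrative)
m = 10           # observation dimension (illustrative)
n_samples = 500  # prior samples for the Monte Carlo estimate

# Hypothetical linear-Gaussian inverse problem: y = G x + noise.
G = rng.normal(size=(m, d)) / np.sqrt(d)
noise_std = 0.1
y_obs = G @ rng.normal(size=d) + noise_std * rng.normal(size=m)

def grad_log_likelihood(x):
    """Gradient of the Gaussian log-likelihood with respect to x."""
    residual = y_obs - G @ x
    return G.T @ residual / noise_std**2

# Monte Carlo estimate of the diagnostic matrix over standard-normal prior samples.
H = np.zeros((d, d))
for _ in range(n_samples):
    g = grad_log_likelihood(rng.normal(size=d))
    H += np.outer(g, g) / n_samples

# Leading eigenvectors give the informed subspace; the eigenvalue decay
# suggests how many dimensions are needed for a given approximation quality.
eigvals, eigvecs = np.linalg.eigh(H)
order = np.argsort(eigvals)[::-1]
r = 5                        # illustrative subspace rank
U_r = eigvecs[:, order[:r]]  # columns span the reduced parameter subspace
print("top eigenvalues:", eigvals[order[:r]])
```

In practice, the rank r would be chosen from the decay of the eigenvalues, which is what ties the reduced dimension to a bound on the posterior approximation error of the kind described in the abstract.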