Variational inference with stochastic gradients, commonly called black-box variational inference (BBVI) or stochastic gradient variational inference, is the workhorse of probabilistic inference in the large-data, large-model regime. For a decade, however, the computational properties of VI have remained largely unknown. For instance, under what conditions is BBVI guaranteed to converge, and is it provably efficient? In this talk, I will present recent theoretical results on VI in the form of quantitative, non-asymptotic convergence guarantees for obtaining a variational posterior. Following this, I will demonstrate the usefulness of the theoretical framework by investigating the theoretical properties of various design choices and algorithmic modifications, such as parametrizations of the variational approximation, variance-reduced gradient estimators like sticking-the-landing, structured variational families, and beyond.
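For readers unfamiliar with the setup, the sketch below is a rough illustration (not from the talk) of BBVI as stochastic gradient ascent on a reparameterized Monte Carlo ELBO estimate; it assumes JAX, a toy Gaussian target, a mean-field Gaussian variational family, and illustrative hyperparameters, with an optional flag showing the sticking-the-landing style of variance-reduced estimator mentioned in the abstract.

```python
# A minimal, illustrative sketch of BBVI with reparameterization gradients.
# The target, variational family, and hyperparameters are assumptions made
# for this example only.
import jax
import jax.numpy as jnp

def log_target(z):
    # Unnormalized log density of a toy 2D target (a standard Gaussian here).
    return -0.5 * jnp.sum(z ** 2)

def log_q(z, mu, log_sigma):
    # Log density of the diagonal-Gaussian variational approximation q(z).
    return -0.5 * jnp.sum(((z - mu) / jnp.exp(log_sigma)) ** 2
                          + 2.0 * log_sigma + jnp.log(2.0 * jnp.pi))

def elbo(params, key, n_samples=16, sticking_the_landing=False):
    # Monte Carlo ELBO via the reparameterization trick: z = mu + sigma * eps.
    mu, log_sigma = params
    eps = jax.random.normal(key, (n_samples, mu.shape[0]))
    z = mu + jnp.exp(log_sigma) * eps
    if sticking_the_landing:
        # Sticking-the-landing: stop the gradient of the variational
        # parameters inside log q, keeping only the path derivative.
        mu, log_sigma = jax.lax.stop_gradient((mu, log_sigma))
    log_q_vals = jax.vmap(log_q, in_axes=(0, None, None))(z, mu, log_sigma)
    return jnp.mean(jax.vmap(log_target)(z) - log_q_vals)

@jax.jit
def step(params, key, lr=1e-2):
    # One stochastic gradient ascent step on the ELBO estimate.
    grads = jax.grad(elbo)(params, key)
    return jax.tree_util.tree_map(lambda p, g: p + lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = (jnp.zeros(2), jnp.zeros(2))  # (mu, log_sigma)
for _ in range(500):
    key, subkey = jax.random.split(key)
    params = step(params, subkey)
```

The convergence questions in the talk concern exactly this kind of iteration: how the choice of parametrization (for example, optimizing log_sigma rather than sigma) and of gradient estimator affects whether, and how fast, such stochastic gradient steps reach the variational posterior.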
To join this seminar virtually, please request Zoom connection details from ea@stat.ubc.ca.
Speaker's page: Kyurae Kim
Location: ESB 4192 / Zoom
Event date: -
Speaker: Kyurae Kim, Ph.D. student, Computer and Information Sciences, University of Pennsylvania