To join this seminar virtually: Please request Zoom connection details from ea [at] stat.ubc.ca.
Abstract: Computationally expensive integration problems are ubiquitous across statistics and machine learning. This creates a need for methods that approximate integrals well with as few samples as possible. Bayesian quadrature is a probabilistic integration method in which a Gaussian process prior is placed on the integrand, allowing information about properties of the integrand – such as smoothness – to be used for improved sample efficiency. I will discuss two projects in which we used Bayesian quadrature to construct better estimators: (1) an improved estimator for the maximum mean discrepancy when the measure is a pushforward, and (2) estimators for conditional expectations. In addition, I will discuss how the choice of prior kernel affects the quality of uncertainty quantification in Gaussian process interpolation (and consequently in Bayesian quadrature), and present a comparison of maximum likelihood and cross-validation estimators showing that cross-validation is more robust to smoothness misspecification.
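As a rough illustration of the idea (not code from the talk), Bayesian quadrature for an integral over [0, 1] with a squared-exponential kernel can be sketched as below. The posterior mean of the integral is a weighted sum of function values, and the posterior variance quantifies the remaining uncertainty; the specific kernel, length-scale, and uniform integration measure here are illustrative assumptions chosen so the kernel-mean embeddings have closed forms.

```python
import numpy as np
from scipy.special import erf

def bayesian_quadrature(f, x, ell=0.4, jitter=1e-8):
    """Illustrative Bayesian quadrature for integral of f over [0, 1].

    Places a zero-mean GP prior with squared-exponential kernel on f;
    returns the posterior mean and variance of the integral.
    (ell and jitter are assumed hyperparameters for this sketch.)
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Gram matrix of k(x, x') = exp(-(x - x')^2 / (2 ell^2)), with jitter
    # on the diagonal for numerical stability.
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ell ** 2))
    K += jitter * np.eye(n)
    # Kernel-mean embedding z_i = integral of k(x, x_i) over [0, 1],
    # available in closed form via the error function.
    c = ell * np.sqrt(np.pi / 2)
    z = c * (erf((1 - x) / (np.sqrt(2) * ell))
             - erf(-x / (np.sqrt(2) * ell)))
    # Double integral of k over [0, 1]^2 (the prior variance of the integral).
    zz = 2 * (c * erf(1 / (np.sqrt(2) * ell))
              - ell ** 2 * (1 - np.exp(-1 / (2 * ell ** 2))))
    w = np.linalg.solve(K, z)              # quadrature weights
    mean = w @ f(x)                        # posterior mean of the integral
    var = zz - z @ w                       # posterior variance
    return mean, var
```

For a smooth integrand such as f(x) = x^2, even a handful of evaluations gives an accurate estimate, e.g. `bayesian_quadrature(lambda t: t**2, np.linspace(0.05, 0.95, 8))` returns a mean close to 1/3 together with a small posterior variance. The weights depend on the kernel, which is how assumptions about smoothness enter the estimator.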