Ensembles in the Age of Overparameterization: Promises and Pathologies

Wednesday, October 23, 2024 - 11:00 to 12:00
Geoff Pleiss, UBC Statistics Assistant Professor
Statistics Seminar
ICCS X836 / Zoom
Abstract: Ensemble methods have historically used either high-bias base learners (e.g., through boosting) or high-variance base learners (e.g., through bagging). Modern neural networks cannot be understood through this classic bias-variance tradeoff, yet "deep ensembles" are pervasive in safety-critical and high-uncertainty application domains. This talk will cover surprising and counterintuitive phenomena that emerge when ensembling overparameterized base models like neural networks. While deep ensembles improve generalization in a simple and cost-effective manner, they are often outperformed in accuracy and robustness by single (but larger) models. Furthermore, discouraging diversity amongst component models often improves the ensemble's predictive performance, counter to the classic intuitions underpinning bagging and feature-subsetting techniques. I will connect these empirical findings with new theoretical characterizations of overparameterized ensembles, and I will conclude with implications for uncertainty quantification, robustness, and decision making.
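For readers unfamiliar with the setup, a "deep ensemble" typically means training several networks independently and averaging their predicted class probabilities at test time. The minimal sketch below (illustrative only, not code from the talk; the random logits stand in for the outputs of hypothetical trained models) shows the averaging step in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Stand-in logits from M=5 independently trained models,
# evaluated on N=4 inputs with C=3 classes.
M, N, C = 5, 4, 3
logits = rng.normal(size=(M, N, C))

# Deep-ensemble prediction: average per-model class probabilities,
# then take the argmax of the averaged distribution.
probs = softmax(logits)              # shape (M, N, C)
ensemble_probs = probs.mean(axis=0)  # shape (N, C)
pred = ensemble_probs.argmax(axis=-1)
```

The averaged distribution `ensemble_probs` is also what is commonly used for uncertainty quantification, e.g. via its entropy or max-probability confidence.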