To Join Via Zoom: To join this seminar virtually, please request Zoom connection details from headsec [at] stat.ubc.ca.
Abstract: Boosting is a highly flexible and powerful approach for prediction in non-parametric settings. Despite the popularity and practical success of boosting algorithms, relatively little attention has been paid to their generalizations to "complex data", such as data with outliers or functional variables. For data contaminated with outliers, we propose a two-stage boosting algorithm analogous to robust linear MM-regression: it first minimizes a robust residual scale estimator, and then improves the fit by optimizing a bounded loss function. For data containing functional predictors, we propose a tree-based boosting algorithm that uses "base-learners" constructed from multiple projections. This proposal incorporates possible interactions between projection indices, making it capable of approximating complex regression functions. Finally, we extend both proposals to contaminated functional data and explore two variations that can be used to perform robust functional regression.
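To give a flavor of the two-stage idea, here is a minimal, simplified sketch (not the speakers' actual algorithm): the robust scale is approximated by a MAD of initial residuals, and the second stage boosts regression stumps along the negative gradient of Tukey's bounded bisquare loss, so that large outlying residuals receive zero gradient and do not drive the fit. All names and tuning constants here are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def tukey_grad(r, c=4.685):
    # Derivative (psi function) of Tukey's bisquare loss: bounded influence,
    # exactly zero for standardized residuals beyond the cutoff c.
    g = r * (1.0 - (r / c) ** 2) ** 2
    g[np.abs(r) > c] = 0.0
    return g

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=200)
y[:10] += 20.0  # inject gross outliers in the response

# Stage 1 (simplified stand-in): a robust residual scale via the MAD
# of residuals from an initial median fit.
f = np.full_like(y, np.median(y))
s = 1.4826 * np.median(np.abs(y - f))

# Stage 2: gradient boosting with stumps on the bounded-loss gradient.
learning_rate = 0.1
for _ in range(200):
    r = (y - f) / s                      # standardized residuals
    neg_grad = tukey_grad(r)             # outliers contribute nothing here
    stump = DecisionTreeRegressor(max_depth=1).fit(X, neg_grad)
    f += learning_rate * stump.predict(X)
```

Because the bisquare gradient vanishes for residuals beyond the cutoff, the contaminated points are effectively ignored while the fit improves on the clean bulk of the data.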