To Join via Zoom: To join this seminar virtually, please request Zoom connection details from headsec [at] stat.ubc.ca
Abstract: I introduce a method for approximate marginal likelihood inference via adaptive Gaussian quadrature in mixed models with a single grouping factor. The core technical contributions are (a) an algorithm for computing the exact gradient of the approximate log marginal likelihood and (b) a useful parameterization of the multivariate Gaussian. The former enables efficient quasi-Newton optimization of the marginal likelihood that is several times faster than established methods; the latter yields Wald confidence intervals for random-effects variances that attain nominal coverage and low bias when enough quadrature points are used. The Laplace approximation is a special case of the method and is shown in simulations to perform exceptionally poorly for binary random-slopes models, a deficiency mitigated simply by adding more quadrature points.
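For readers unfamiliar with the technique, the following is a minimal sketch (not the speaker's implementation) of adaptive Gauss-Hermite quadrature for the marginal likelihood of one group, using an illustrative logistic random-intercept model; the model, data, and all function names here are assumptions for the example. With k = 1 node the rule reduces to the Laplace approximation discussed in the abstract.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import roots_hermite

def log_joint(u, y, x, beta, sigma):
    """Log of p(y | u) * phi(u; 0, sigma^2) for one group under an
    illustrative logistic random-intercept model."""
    eta = beta[0] + beta[1] * x + u
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * u**2 / sigma**2
    return loglik + logprior

def aghq_log_marginal(y, x, beta, sigma, k=5):
    """Adaptive Gauss-Hermite approximation to the log marginal likelihood
    of one group; k = 1 recovers the Laplace approximation."""
    # 1. Locate the mode of the integrand.
    res = minimize_scalar(lambda u: -log_joint(u, y, x, beta, sigma))
    u_hat = res.x
    # 2. Negative second derivative at the mode (central difference).
    eps = 1e-4
    h = -(log_joint(u_hat + eps, y, x, beta, sigma)
          - 2.0 * log_joint(u_hat, y, x, beta, sigma)
          + log_joint(u_hat - eps, y, x, beta, sigma)) / eps**2
    # 3. Shift and scale the Gauss-Hermite rule to the mode.
    nodes, weights = roots_hermite(k)
    s = np.sqrt(2.0 / h)
    u_k = u_hat + s * nodes
    vals = np.array([log_joint(u, y, x, beta, sigma) for u in u_k])
    # Log-sum-exp for stability; exp(nodes**2) undoes the GH weight function.
    terms = np.log(weights) + nodes**2 + vals
    m = terms.max()
    return np.log(s) + m + np.log(np.sum(np.exp(terms - m)))
```

Because the nodes are re-centered at the mode of each group's integrand, a handful of points typically suffices, whereas non-adaptive quadrature would need many more.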