The Metropolis-Hastings algorithm is often used to obtain Markov chain Monte Carlo (MCMC) samples from highly complex posterior distributions, such as those arising in full inference of the hyper-parameters of Gaussian Process models. Here we compare one new and three existing implementations of the Metropolis-Hastings algorithm in the context of sampling from the posterior distributions of the hyper-parameters of a Gaussian Process. The implementations are compared in terms of their initialization biases and convergence rates, as well as their performance on higher-dimensional data. Our experiments involve sampling from GP posterior distributions using the four implementations and comparing the quality of the resulting samples. A discrepancy measure based on the Kolmogorov-Smirnov test is devised to measure the convergence rate of each algorithm. Issues in generating MCMC samples for high-dimensional data are discussed, and an optimal approach to sampling, called the Laplace approach, is proposed. All of the algorithms in this paper are implemented in an R package called ‘gpMCMC’.
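To fix ideas, the kind of sampler compared in the talk can be sketched as a random-walk Metropolis-Hastings chain over a single GP hyper-parameter, here the squared-exponential lengthscale, with a KS-style convergence check at the end. This is a minimal illustrative sketch on toy data; the function and variable names are assumptions for this example and are not the 'gpMCMC' package API.

```r
set.seed(1)

# Toy data from a smooth function
x <- seq(0, 1, length.out = 20)
y <- sin(2 * pi * x) + rnorm(20, sd = 0.1)

# Log unnormalized posterior of the log-lengthscale:
# GP log marginal likelihood (squared-exponential kernel, known noise)
# plus a standard normal prior on log_ell (an assumed prior for this sketch)
log_post <- function(log_ell) {
  ell <- exp(log_ell)
  K <- exp(-outer(x, x, "-")^2 / (2 * ell^2)) + diag(0.1^2, length(x))
  L <- chol(K)
  alpha <- backsolve(L, forwardsolve(t(L), y))
  -0.5 * sum(y * alpha) - sum(log(diag(L))) + dnorm(log_ell, log = TRUE)
}

# Random-walk Metropolis-Hastings on log_ell
n_iter <- 2000
chain <- numeric(n_iter)
chain[1] <- 0
for (t in 2:n_iter) {
  prop <- chain[t - 1] + rnorm(1, sd = 0.3)
  if (log(runif(1)) < log_post(prop) - log_post(chain[t - 1])) {
    chain[t] <- prop
  } else {
    chain[t] <- chain[t - 1]
  }
}

# An illustrative KS-style convergence diagnostic: compare the empirical
# distributions of two halves of the chain after burn-in
ks <- ks.test(chain[501:1250], chain[1251:2000])
```

A large KS statistic between the two halves suggests the chain has not yet converged, which is the spirit of the discrepancy measure described above.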
Optimizing the Metropolis-Hastings algorithm for Gaussian Processes
Tuesday, May 10, 2016 - 11:00
Gal Av-Gay, MSc student, UBC Statistics
Room 4192, Earth Sciences Building, 2207 Main Mall