Stat 547Q : Statistical Modeling with Stochastic Processes
[Figure: Kingman's coalescent emerging from a Wright-Fisher model. Note also the connection with perfect sampling.]
- Instructor: Alexandre Bouchard-Côté
- Check latest updates: [HTML][RSS]
- Dates: February 28 - April 6, 2011
- Time: MW 13:30 - 15:00
- Place: LSK 301
- Don't hesitate to contact me if you have questions: bouchard AT stat.ubc.ca
- Next office hour: Wednesday, 3/30, 3:00-4:00, at LSK 330
- Template for scribing: [zip]
- Final project guidelines: [pdf]
- Part 1: Background. See slides 1-5
- Part 2: Basics of Dirichlet processes. [pdf]
- Part 3: Applications of Dirichlet processes. [pdf]
- Part 4: Beyond Dirichlet processes (beta version; some parts are not yet reviewed; protected by the same username and password as the assignments). [pdf]
- Lecture 1: Overview; motivating examples and applications. [pdf]
- Lecture 2: Overview of computational issues. Background: exact inference. [pdf]
- Lecture 3: Background: MCMC. [pdf]
- Lecture 4: More MCMC, background for variational methods. [pdf]
- Lecture 5: Wrapping up variational methods; theoretical foundations of Dirichlet Processes. [pdf]
- Lecture 6: Theoretical foundations of Dirichlet Processes (DPs), important properties of DPs. [pdf]
- Lecture 7: More on Polya urns, posterior inference. [pdf]
- Lecture 8: Applications of DPs. [pdf]
- Lecture 9: NLP applications. Hierarchical DP. Intro to Pitman-Yor. [pdf]
- Lecture 10: More on hierarchical models and PY. Infinite HMM, Beta process. [pdf]
- Lecture 11: Beta, Gamma, and Poisson processes. Campbell's theorem and its application for Beta, Gamma, and Dirichlet process constructions. [pdf]
- Lecture 12: Sequence memoizer, Dependent Dirichlet process, CTMCs. Intro to phylogenetics. [pdf]
Stochastic processes are powerful tools for constructing the rich models needed to capture the complexity of our world. Motivated by problems in machine learning and phylogenetics, we will discuss an array of concrete examples where stochastic processes are used to perform sophisticated statistical inferences. After going through these examples, you will be familiar with the main building blocks, know how to compose them to create new models, be able to design inference algorithms for your models, and have a better understanding of the limits of these models and algorithms.
I will not assume previous exposure to stochastic processes. Probabilists interested in applications of their work are encouraged to participate: in contrast to MATH 303 and MATH 419, the focus will be on statistical and computational issues.
List of topics:
- Motivations: machine learning, Bayesian statistics, de Finetti's theorem. Applications in computational biology, natural language processing, classification, etc.
- Basics: Kolmogorov consistency theorem, Levy processes, completely random measures, introduction to/review of Markov Chain Monte Carlo (MCMC) and variational methods
- Random measures: Dirichlet, Pitman-Yor and Beta processes. Mixture models. Representations and computations: stick-breaking processes, Pólya urn, and Chinese restaurants. Slice and collapsed samplers. If time permits: variational approaches; split-merge, auxiliary-variable and hyper-parameter sampling techniques. Extensions: hierarchies, sequences, cross-products, nesting and dependence. Applications to natural language modeling, multi-target tracking, machine translation, clustering and time series. Consistency. Robins-Ritov paradox.
- Random combinatorial objects: paths and trees. Domains of application covered: phylogenetics and agglomerative clustering. Distribution over trees. Computing posteriors over unrooted trees using MCMC. Branching processes, estimation of ultimate extinction probabilities and consequences in modeling.
- Continuous Time Markov Chains: Kolmogorov forward/backward equations, application to modeling DNA mutations and typological change. If time permits: Thorne-Kishino-Felsenstein birth-death models. Estimation of rate parameters using eigenvalue decomposition. Optional: Rao-Blackwellization of samplers using the sum-product algorithm. Kingman's coalescent and Brownian motion: applications to population genetics. Wright-Fisher model and its limiting processes. If time permits: computing posteriors over ultrametric trees using Sequential Monte Carlo. Relation to the Chinese restaurant process and Ewens's sampling formula.
- Overview of other processes and applications. Depending on time and interest, a selection from: Poisson processes for regression; Cox processes for dynamic social networks; Gaussian processes for classification; gamma processes for object recognition; diffusion processes for density estimation.
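To give a concrete taste of the random-measure representations listed above, here is a minimal sketch of forward sampling from the Chinese restaurant process (function and variable names are my own, not from the course materials):

```python
import random

def crp_sample(n_customers, alpha, rng=random):
    """Sample a partition from the Chinese restaurant process with
    concentration parameter alpha.

    Customer 1 sits at a new table; each later customer joins an
    existing table with probability proportional to its occupancy,
    or opens a new table with probability proportional to alpha.
    """
    tables = []       # tables[k] = number of customers at table k
    assignments = []  # assignments[i] = table index of customer i
    for _ in range(n_customers):
        # Unnormalized seating weights: existing tables, then a new one.
        weights = tables + [alpha]
        u = rng.random() * sum(weights)
        acc = 0.0
        for k, w in enumerate(weights):
            acc += w
            if u < acc:
                break
        if k == len(tables):
            tables.append(1)   # open a new table
        else:
            tables[k] += 1     # join existing table k
        assignments.append(k)
    return assignments, tables
```

The induced distribution over partitions is exchangeable, which is what makes collapsed (marginal) samplers for DP mixtures possible.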
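For the branching-process topic, the ultimate extinction probability is the smallest fixed point of the offspring probability generating function, computable by simple fixed-point iteration from 0. A sketch under an illustrative offspring distribution of my choosing:

```python
def extinction_probability(offspring_pmf, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of the offspring probability generating
    function f(s) = sum_k p_k s^k, found by iterating q <- f(q)
    starting from q = 0.  This limit is the probability of ultimate
    extinction of the branching process.
    """
    def f(s):
        return sum(p * s**k for k, p in enumerate(offspring_pmf))
    q = 0.0
    for _ in range(max_iter):
        q_next = f(q)
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q

# Illustrative offspring law: 0, 1 or 2 children with probs 1/4, 1/4, 1/2.
# Mean offspring is 5/4 > 1, so extinction is possible but not certain;
# solving q = 1/4 + q/4 + q^2/2 gives extinction probability 1/2.
q = extinction_probability([0.25, 0.25, 0.5])
```

The iteration from 0 always converges to the smallest root in [0, 1], which is 1 exactly when the mean offspring number is at most one (the subcritical/critical case).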
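For the continuous-time Markov chain topic, the two-state chain is the one case where the Kolmogorov forward equations have a short closed-form solution, which makes a useful sanity check before moving to matrix exponentials. A sketch (the rate parametrization is a standard textbook convention, not taken from the course notes):

```python
import math

def two_state_transition(a, b, t):
    """Transition matrix P(t) for the two-state CTMC with rate matrix

        Q = [[-a,  a],
             [ b, -b]],

    solving the Kolmogorov forward equation P'(t) = P(t) Q in closed
    form: P(t) = Pi + exp(-(a+b) t) (I - Pi), where Pi is the matrix
    whose rows are the stationary distribution (b, a) / (a + b).
    """
    r = a + b
    e = math.exp(-r * t)
    p00 = b / r + (a / r) * e
    p11 = a / r + (b / r) * e
    return [[p00, 1.0 - p00], [1.0 - p11, p11]]
```

As t grows, both rows converge to the stationary distribution, and the family satisfies the Chapman-Kolmogorov property P(s + t) = P(s) P(t), which is the semigroup structure exploited when estimating rates via eigenvalue decomposition.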
Evaluation: assignments (best 2 of 3), scribing for at least one lecture, and a final project. If you choose to do all 3 assignments, you also have the option of substituting the final project with a literature review.
- See slides of lecture 5 above for the latest list.
- Information Theory, Inference, and Learning Algorithms. (2003) D. MacKay. Part IV [free ebook]
- Graphical models, exponential families, and variational inference. (2008) M. J. Wainwright and M. I. Jordan. Chapters 1-5 [pdf]
- Dirichlet processes, Chinese restaurant processes and all that. M. I. Jordan. (2005) [ps]