Alexandre Bouchard-Côté

What is Bayesian analysis? Some popular answers:

- MAP estimators (maximum a posteriori)
- posterior means
- Bayes rule
- models where some unknown quantities are treated as random
- none of the above

All these popular answers are misleading and/or very incomplete:

- MAP estimators (maximum a posteriori)
  - MAP is seldom used by expert Bayesians (the mode is misleading in high dimensions)
- posterior means
  - the posterior mean is often undefined (e.g., Bayesian analysis over combinatorial objects)
- Bayes rule
  - Bayes rule is intractable in most practical situations (we use MCMC/variational methods)
- models where some unknown quantities are treated as random
  - true for Bayesian models, but also for many non-Bayesian models, e.g., random effects models
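To see why the mode is misleading in high dimensions, consider a standard \(d\)-dimensional Gaussian: its density is maximized at the origin, yet essentially all of its mass lies in a thin shell of radius about \(\sqrt{d}\), far from the mode. A quick numerical sketch (the dimension and sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1000       # dimension
n = 10_000     # number of samples

# Draws from a standard d-dimensional Gaussian N(0, I_d); its mode is the origin.
samples = rng.standard_normal((n, d))
radii = np.linalg.norm(samples, axis=1)

# The norms concentrate tightly around sqrt(d) ~ 31.6, nowhere near the mode at 0.
print(radii.mean(), radii.std())
```

So a typical sample looks nothing like the MAP point: the mode sits in a region of negligible probability mass.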

**Bayesian Analysis:** the statistical discipline centered on the use of **Bayes estimators**

**Bayes estimators:** for data \(X\), unobserved quantities \(Z\), loss function \(L\), and set of possible actions \({\mathcal{A}}\), the Bayes estimator is defined as:

\[{\textrm{argmin}}\{ {\mathbf{E}}[L(a, Z) | X] : a \in {\mathcal{A}}\}\]
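Concretely, once we can sample from the posterior of \(Z\) given \(X\), the expectation in this definition can be replaced by a Monte Carlo average, and the argmin searched over candidate actions. A minimal sketch under squared loss, where the Bayes estimator is known to be the posterior mean (the Gamma "posterior" and the grid of actions are purely illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for posterior samples of Z given X (illustrative Gamma posterior,
# with posterior mean 3 * 2 = 6).
z = rng.gamma(shape=3.0, scale=2.0, size=50_000)

def squared_loss(a, z):
    return (a - z) ** 2

# Approximate E[L(a, Z) | X] for each candidate action a, then take the argmin.
actions = np.linspace(0.0, 20.0, 2001)
expected_loss = np.array([squared_loss(a, z).mean() for a in actions])
bayes_action = actions[np.argmin(expected_loss)]

# Under squared loss, the Bayes estimator coincides with the posterior mean.
print(bayes_action, z.mean())
```

Changing the loss changes the estimator: absolute loss yields the posterior median, and 0-1 loss (on a discrete space) yields the MAP, which is how the "special cases" below arise from the single definition above.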

The primary objective of this course is to understand Bayes estimators:

- Why they are so powerful
- Their limitations (model misspecification, computational challenges)
- Important special cases (posterior means, credible intervals, MAP)
- How to use them in practice:
  - how to build models
  - **how to approximate conditional expectations** (MCMC methods)
- A bit of theory (asymptotics)
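As a preview of the MCMC idea, a conditional expectation \({\mathbf{E}}[h(Z) | X]\) can be approximated by averaging \(h\) over samples from a Markov chain whose stationary distribution is the posterior. A minimal random-walk Metropolis sketch (the standard-normal target and the step size are illustrative assumptions, not a model from the course):

```python
import math
import random

random.seed(0)

def log_unnormalized_posterior(z):
    # Log-density of a standard normal, up to an additive constant
    # (illustrative target; MCMC only needs the unnormalized density).
    return -0.5 * z * z

def metropolis(n_iters, step=1.0):
    z = 0.0
    samples = []
    for _ in range(n_iters):
        proposal = z + random.gauss(0.0, step)
        log_ratio = log_unnormalized_posterior(proposal) - log_unnormalized_posterior(z)
        if math.log(random.random()) < log_ratio:
            z = proposal  # accept the proposal; otherwise keep the current state
        samples.append(z)
    return samples

samples = metropolis(100_000)
# Ergodic averages approximate E[Z | X] and E[Z^2 | X] (true values: 0 and 1).
mean = sum(samples) / len(samples)
second_moment = sum(z * z for z in samples) / len(samples)
print(mean, second_moment)
```

Note that the normalizing constant from Bayes rule never appears: it cancels in the acceptance ratio, which is precisely why MCMC sidesteps the intractability mentioned above.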

**Christian Robert, The Bayesian Choice, 2nd edition.**

- Available for free inside UBC VPN: https://tinyurl.com/y5sz278w
- Takes the general view of Bayesian analysis that I described in these slides
- Read Chapter 2, “Decision-Theoretic Foundations” (especially Sections 2.1 and 2.3)