Advanced Statistical Inference
Academic Year: 2025
Course Information
This course focuses on the principles of learning from data and the quantification of uncertainty, complementing and enriching the “Introduction to Statistical Learning” and “Machine Learning and Intelligence Systems” courses. The course is divided into two main parts, corresponding to “Inference” and “Prediction”.
During the first half, we will focus on the principles of probabilistic inference, and the quantification of uncertainty in the context of machine learning. We will start by drawing connections between optimization and probabilistic inference (e.g., maximum likelihood estimation, maximum a posteriori estimation). We will then introduce the concept of Bayesian inference, and discuss how to perform inference in complex and intractable models using approximate methods (e.g., variational inference, Markov Chain Monte Carlo, Laplace approximation).
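To illustrate the optimization–inference connection mentioned above, here is a minimal sketch (not course material; all names and values are illustrative) estimating a Gaussian mean with known variance: the maximum likelihood estimate is the sample mean, while the maximum a posteriori estimate under a conjugate Gaussian prior shrinks it toward the prior mean.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)  # observations, known noise variance 1

# MLE: the argmax of the Gaussian log-likelihood is the sample mean
mle = data.mean()

# MAP with prior N(mu0, tau2): closed-form posterior mean in the conjugate case
mu0, tau2, sigma2, n = 0.0, 1.0, 1.0, len(data)
map_est = (n / sigma2 * data.mean() + mu0 / tau2) / (n / sigma2 + 1.0 / tau2)

print(mle, map_est)  # the MAP estimate is pulled toward the prior mean mu0
```

With a flat prior (tau2 → ∞) the MAP estimate reduces to the MLE, which is exactly the kind of connection the first half of the course develops.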
The second half of the course will focus on prediction and the evaluation of predictive models. We will start by discussing simple predictive models (e.g., linear regression, logistic regression), and then move on to more complex models (e.g., Gaussian processes, neural networks). Notably, we will discuss all these models in the context of probabilistic inference, and we will put into practice the probabilistic methods introduced in the first half. While mostly focused on supervised learning, we will also briefly take a look at unsupervised learning and generative models.
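As a taste of how a simple predictive model is treated probabilistically, the sketch below (illustrative only; the data, prior precision `alpha`, and noise precision `beta` are assumptions) computes the closed-form posterior over the weights of a Bayesian linear regression, along with a predictive variance that quantifies uncertainty at a new input.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.uniform(-1, 1, 30)])  # design matrix with bias column
w_true = np.array([0.5, -1.0])
y = X @ w_true + rng.normal(scale=0.2, size=30)             # noisy targets

alpha, beta = 1.0, 25.0  # prior precision on weights, noise precision (1 / 0.2**2)

# Conjugate Gaussian posterior over weights: w | y ~ N(mN, SN)
SN = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)  # posterior covariance
mN = beta * SN @ X.T @ y                                # posterior mean

# Predictive variance at a new input: noise variance plus parameter uncertainty
x_new = np.array([1.0, 0.0])
pred_var = 1.0 / beta + x_new @ SN @ x_new
print(mN, pred_var)
```

The predictive variance decomposes into irreducible noise (`1/beta`) and a term reflecting uncertainty about the weights, a pattern that recurs with Gaussian processes and Bayesian neural networks later in the course.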
Finally, the course will be complemented by practical sessions, where students will be able to implement and experiment with the methods discussed in class (using Python).
Course Objectives
- Understand the principles of probabilistic inference and quantification of uncertainty.
- Understand the principles of Bayesian inference and how to perform (approximate) inference in complex models.
- Understand how to evaluate and compare predictive models, and how to quantify uncertainty in predictions.
- Apply the principles of probabilistic inference to a variety of models, including linear regression, logistic regression, Gaussian processes, and neural networks.
Prerequisites
- Probability theory and statistics.
- Linear algebra and calculus.
- Basic programming skills (Python).
- Basic knowledge of machine learning.
Tentative schedule
| Week | Topic |
|---|---|
| Week 01 | Introduction and revision of linear algebra and probability |
| Week 02 | Linear regression: optimization and Bayesian inference |
| Week 03 | Lab: Bayesian linear regression |
| Week 04 | Approximate inference 1: Monte Carlo methods, Markov Chain Monte Carlo |
| Week 05 | Approximate inference 2: variational inference, Laplace approximation |
| Week 06 | Classification: Bayesian logistic regression, naive Bayes classifier |
| Week 07 | Lab: Bayesian logistic regression with MCMC |
| Week 08 | Gaussian processes |
| Week 09 | Lab: Gaussian processes |
| Week 10 | Lab: Bayesian logistic regression with variational inference |
| Week 11 | Neural networks and Bayesian neural networks |
| Week 12 | Generative models 1 |
| Week 13 | Generative models 2 |
| Week 14 | Lab: generative models and course revision for exam |