Advanced Statistical Inference

Academic Year: 2025

EURECOM

Course Information

This course focuses on the principles of learning from data and the quantification of uncertainty, complementing and enriching the “Introduction to Statistical Learning” and “Machine Learning and Intelligent Systems” courses. The course is divided into two main parts, corresponding to “Inference” and “Prediction”.

During the first half, we will focus on the principles of probabilistic inference, and the quantification of uncertainty in the context of machine learning. We will start by drawing connections between optimization and probabilistic inference (e.g., maximum likelihood estimation, maximum a posteriori estimation). We will then introduce the concept of Bayesian inference, and discuss how to perform inference in complex and intractable models using approximate methods (e.g., variational inference, Markov Chain Monte Carlo, Laplace approximation).
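
To give a flavor of the connection between optimization and inference, here is a minimal sketch (not part of the official course material; the synthetic data, the NumPy usage, and the regularization strength are assumptions made purely for illustration). It shows that, for linear regression with Gaussian noise, maximum likelihood estimation is the same optimization problem as least squares, and MAP estimation with a zero-mean Gaussian prior on the weights corresponds to ridge regression.

```python
# Illustrative sketch only: MLE vs. MAP for linear regression, viewed as optimization.
# The synthetic data and the regularization strength `lam` are assumptions, not course material.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                  # design matrix: 100 points, 3 features
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)    # observations with Gaussian noise

# MLE: maximizing the Gaussian log-likelihood == minimizing the squared error.
w_mle = np.linalg.solve(X.T @ X, X.T @ y)

# MAP with a zero-mean Gaussian prior on w: the solution is ridge regression,
# where `lam` plays the role of the ratio of noise variance to prior variance.
lam = 1.0
w_map = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

print("MLE estimate:", w_mle)
print("MAP estimate:", w_map)
```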

The second half of the course will focus on prediction and the evaluation of predictive models. We will start by discussing simple predictive models (e.g., linear regression, logistic regression), and then move on to more complex models (e.g., Gaussian processes, neural networks). Notably, we will discuss all these models in the context of probabilistic inference, putting into practice the probabilistic methods introduced in the first half. While mostly focused on supervised learning, we will also briefly take a look at unsupervised learning and generative models.
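
As a preview of how prediction and uncertainty quantification fit together, the sketch below (again illustrative only; the prior precision alpha, noise precision beta, and synthetic data are assumed values, not course material) computes the closed-form posterior of Bayesian linear regression and uses it to report both a predictive mean and a predictive variance at a new input.

```python
# Illustrative sketch only: predictive mean and variance from Bayesian linear regression.
# The hyperparameters alpha (prior precision) and beta (noise precision) are assumed values.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=30)
y = 2.0 * x - 1.0 + 0.3 * rng.normal(size=30)

Phi = np.stack([np.ones_like(x), x], axis=1)   # features: bias + linear term
alpha, beta = 2.0, 1.0 / 0.3**2                # prior precision, noise precision

# Closed-form Gaussian posterior over the weights: N(m_N, S_N).
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ y

# Predictive distribution at a new input x* = 0.5:
# mean plus a variance that combines noise and weight uncertainty.
phi_star = np.array([1.0, 0.5])
pred_mean = m_N @ phi_star
pred_var = 1.0 / beta + phi_star @ S_N @ phi_star
print("predictive mean:", pred_mean, "predictive std:", np.sqrt(pred_var))
```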

Finally, the course will be complemented by practical sessions, where students will be able to implement and experiment with the methods discussed in class (using Python).

Course Objectives

  • Understand the principles of probabilistic inference and quantification of uncertainty.
  • Understand the principles of Bayesian inference and how to perform (approximate) inference in complex models.
  • Understand how to evaluate and compare predictive models, and how to quantify uncertainty in predictions.
  • Apply the principles of probabilistic inference to a variety of models, including linear regression, logistic regression, Gaussian processes, and neural networks.

Prerequisites

  • Probability theory and statistics.
  • Linear algebra and calculus.
  • Basic programming skills (Python).
  • Basic knowledge of machine learning.

📒 Tentative schedule

Week        Topic
📅 Week 01   Introduction and revision of linear algebra and probability
📅 Week 02   Linear regression: optimization and Bayesian inference
📅 Week 03   Lab: Bayesian linear regression
📅 Week 04   Approximate inference 1: Monte Carlo methods, Markov Chain Monte Carlo
📅 Week 05   Approximate inference 2: variational inference, Laplace approximation
📅 Week 06   Classification: Bayesian logistic regression, Naive Bayes classifier
📅 Week 07   Lab: Bayesian logistic regression with MCMC
📅 Week 08   Gaussian processes
📅 Week 09   Lab: Gaussian processes
📅 Week 10   Lab: Bayesian logistic regression with variational inference
📅 Week 11   Neural networks and Bayesian neural networks
📅 Week 12   Generative models 1
📅 Week 13   Generative models 2
📅 Week 14   Lab: Generative models and course revision for exam
