Last year's version: Fall 2021
Next year's version: Fall 2023
This course provides a unifying introduction to statistical modeling of multidimensional data through the framework of probabilistic graphical models, together with their associated learning and inference algorithms.
Teacher: Simon Lacoste-Julien
Office hours: Wednesday 14:00-15:00, room H.04 at 6650 rue Saint-Urbain (see Slack post for directions)
TAs: Sébastien Lachapelle and Tristan Deleu
Office hours: Monday 14:00-15:00 (room H.04), Tuesday 15:00-17:00, Friday 15:00-17:00
The lectures are taught in hybrid mode: they will be given at the Mila Agora, 6650 rue Saint-Urbain, and also streamed synchronously on Zoom (see info on Studium). They will also be recorded for later review and for those in remote time zones.
Probability review
Maximum likelihood estimation
Linear regression, logistic regression, Fisher discriminant
K-means, EM, Gaussian mixtures
Directed and undirected graphical models
Exponential family, information theory
Gaussian networks
Factor analysis
Sum-product algorithm, HMM, junction tree
Approximate inference: sampling, variational methods
Estimation of parameters in graphical models
Bayesian methods
Model selection
Homework (40%) – 5 homework assignments | homework logistics below
Project (30%) – project report to hand in + poster presentation on Dec 16th | detailed info about projects
Final exam (30%) – take-home exam after the poster presentation, due Dec 23rd
The prerequisites are previous coursework in linear algebra, multivariate calculus, and basic probability and statistics. There will be programming for the assignments, so familiarity with some matrix-oriented programming language will be useful (we will use Python with numpy).
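To give a flavor of the expected programming level, here is a minimal, hypothetical warm-up in that style (not taken from an actual assignment): the maximum likelihood estimates of a multivariate Gaussian's mean and covariance, computed with vectorized numpy operations.

```python
# Hypothetical warm-up, not an actual assignment: MLE for a multivariate
# Gaussian, written in the vectorized numpy style the homework relies on.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))   # 500 samples in R^3

mu_hat = X.mean(axis=0)             # MLE of the mean: the sample average
Xc = X - mu_hat                     # center the data (broadcasting)
Sigma_hat = Xc.T @ Xc / len(X)      # MLE of the covariance: note 1/n, not 1/(n-1)

print(mu_hat)
print(Sigma_hat)
```

If lines like these read naturally to you, the programming side of the homework should pose no problem.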
Warning: This class is quite mathematical, and the amount of work is significant (this is a 4-credit class, so expect at least 8 hours of work per week in addition to the lectures), so do not take it if you do not like maths or are looking for an easy class.
The course will follow the (unpublished) manuscript An Introduction to Probabilistic Graphical Models by Michael I. Jordan, which will be made available to the students (but do not distribute it!).
Supplementary references:
For a very detailed and rigorous reference: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman. Referred to as KF in the outline below.
See Part I of the Deep Learning book by Ian Goodfellow, Yoshua Bengio and Aaron Courville for a very gentle review of the applied maths useful for this class. Chapter 5 contains a useful presentation of machine learning basics. Referred to as DL in the outline below.
Another classic book, with a more Bayesian perspective than Mike's book, but one that is at least complete, is Pattern Recognition and Machine Learning by Chris Bishop.
The homework is to be handed in online on Studium before the beginning of the class (Tuesday) on the due date. The detailed instructions for the code hand-in logistics can be found in the news on Studium.
Collaboration policy: you can collaborate with colleagues while working on the homework, but you need to write your own independent write-up. If you have collaborated with others on a question, you need to credit their help by naming them in the write-up (proper acknowledgment is a good practice to have for academia later).
Late homework policy:
You have a budget of 6 late days that you can spend across the 5 homework assignments. To use these days, you need to declare it by selecting the appropriate option in the Google form handed in with the assignment.
The late-day penalty will be the following (as deadlines are on Tuesdays):
handed in Tuesday after the beginning of class: 10% penalty (or 1 late day used)
handed in Wednesday: 20% penalty (or 1 late day used)
handed in Thursday: 40% penalty (or 2 late days used)
handed in Friday: 80% penalty (or 3 late days used)
handed in Monday: 100% penalty (or 4 late days used)
handed in Tuesday (one week later): 100% penalty (or 5 late days used)
handed in later: you will not get any credit (unless you have used some of your late days to reduce the number of days counted late)
No assignment is accepted more than one week late.
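To make the arithmetic above concrete, here is a minimal sketch in Python (illustrative only: the names are made up, the Google form declaration is what actually counts, and the rule letting late days reduce the number of days counted late is simplified to all-or-nothing):

```python
# Illustrative sketch of the late policy above, not an official calculator.
# Maps each hand-in day to (grade penalty, late days needed to waive it).
PENALTIES = {
    "Tue after class": (0.10, 1),
    "Wed": (0.20, 1),
    "Thu": (0.40, 2),
    "Fri": (0.80, 3),
    "Mon": (1.00, 4),
    "Tue +1 week": (1.00, 5),
    # Later than one week: not accepted at all.
}

def effective_grade(raw_grade, handin_day, late_days_left):
    """Spend late days if enough remain, otherwise apply the percentage
    penalty. Simplification: late days are used all-or-nothing here."""
    penalty, days_needed = PENALTIES[handin_day]
    if late_days_left >= days_needed:
        return raw_grade, late_days_left - days_needed   # penalty waived
    return raw_grade * (1 - penalty), late_days_left     # penalty applied

print(effective_grade(90, "Thu", late_days_left=6))  # (90, 4): 2 late days spent
print(effective_grade(90, "Fri", late_days_left=0))  # (18.0, 0): 80% penalty
```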
Below is a draft detailed outline that will be updated as the class goes on. For now it is the outline copied from the Fall 2021 version of this class, with links to the relevant old scribbled notes, which will gradually be replaced with new ones. The related chapters in Mike's book are given (note that they do not exactly correspond to the class content), along with occasional pointers to the Koller and Friedman book (KF), the Deep Learning book (DL) or Bishop's book (B). Related past "scribe notes" from previous editions of this class, as well as (for later lectures) from the class I taught in Paris, are given for now; they will be replaced with more up-to-date scribe notes if needed (and if I receive some).
Date | Topics | Related chapters, scribbled notes, recordings | Scribe notes | Homework milestones |
Sept 6 (Tu) | Set-up & overview | lecture1 recording | Isabela Albuquerque (Fa17) lecture1.pdf source | |
Sept 9 (Fri) | Probability review | 2.1.1 DL: 3 (nice and gentle) KF: 2.1 (more rigorous) lecture2 recording | William Léchelle (Fa16) lecture2.pdf source | |
Sept 13 (Tu) | Prob. review (cont.) Parametric models | 5 lecture3 recording | Philippe Brouillard and Tristan Deleu (Fa17) lecture3.pdf source | Hwk 1 out (hwk 1 source) |
Sept 16 (Fri) | Frequentist vs. Bayesian Maximum likelihood | lecture4 recording | Philippe Brouillard and Tristan Deleu (Fa17) lecture4.pdf source | |
Sept 20 (Tu) | MLE (cont.) Statistical decision theory | 1.3 in Bickel & Doksum Bias-variance tradeoff: 7.3 in Hastie's book lecture5 recording | Sébastien Lachapelle (Fa17) lecture5.pdf source | |
Sept 23 (Fri) | Properties of estimators and MLE | lecture6 recording (2021) | same as lecture5 | |
Sept 27 (Tu) | Linear regression Logistic regression | DL: 7.1 (l2, l1-reg.) 6, 7 lecture7 recording | Zakaria Soliman (Fa16) lecture6.pdf source | Hwk 1 due Hwk 2 out |
Sept 30 (Fri) | Numerical optimization Logistic regression (cont.) and IRLS | 7 DL: 4.3 Boyd's book lecture8 recording | Eeshan Gunesh Dhekane and Younes Driouiche (Fa18) lecture8.pdf source | |
Oct 4 (Tu) | Gen. classification (Fisher) Derivative tricks for Gaussian MLE Kernel trick (skipped) | 7 Matrix Diff. book lecture9 recording lecture9 2017 (kernel trick) (skipped) | Eeshan Gunesh Dhekane (Fa18) lecture9.pdf source | |
Oct 7 (Fri) | Latent variable models K-means EM and GMM | 10, 11 lecture10 recording | Ismael Martinez and Binulal Narayanan (Fa20) lecture12-2020.pdf source | |
Oct 11 (Tu) | Graph Theory Directed Graphical Models | 2 lecture11 recording | Martin Weiss and Eeshan Gunesh Dhekane (Fa18) lecture11.pdf source | |
Oct 14 (Fri) | DGM (cont.) Undirected graphical models | 2 lecture12 recording | Philippe Beardsell (Fa18) lecture12.pdf source | |
Oct 18 (Tu) | UGM (cont.) | 2 lecture13 recording | lecture12 above | Hwk 2 due Hwk 3 out (hwk 3 source) |
Oct 21 (Fri) | Inference: elimination alg. Sum-product alg. | 3, 4 lecture14 recording | lecture13-2018.pdf source Sum-product: see lecture16-2020 below | |
Oct 25 (Tu) | Break: look at projects | | | |
Nov 1 (Tu) | Max-product Junction tree HMM | 17, 12 lecture15 recording | lecture16-2020.pdf source lecture15-2018.pdf source | |
Nov 4 (Fri) | EM for HMM Information theory | 12, 19 lecture16 recording | lecture15-2018.pdf above | |
Nov 8 (Tu) | Max entropy MaxEnt duality | 8 lecture17 (KL geometry: lecture 16 2017) recording (from 2020) | lecture16-2018.pdf source | Hwk 3 due Hwk 4 out (hwk 4 source) Project: team formed |
Nov 11 (Fri) | Exponential families Estimation in graphical models | 8, 9 lecture18 recording | MVA lecture5 MVA lecture8 (new scribe notes to come!) | |
Nov 15 (Tu) | MC integration Sampling | 21 (variance reduction: see old lecture18 2017) lecture19 recording | | |
Nov 18 (Fri) | MCMC Markov chains Metropolis-Hastings | 21 lecture20 recording | | |
Nov 22 (Tu) | Gibbs sampling Variational methods | Bishop: 10.1 lecture21 recording (skipped parts of: old lecture22 2017 for marginal polytope) | MVA lecture9 | |
Nov 25 (Fri) | Gaussian networks Factor analysis, PCA, CCA (Kalman filter) VAE | lecture22 recording 13 old lecture17 Fa2016 14, (15) old lecture18 Fa2016 VAE – DL: 20.10.3 | MVA lecture6.5 MVA lecture7.3 | |
Nov 29 (Tu) | (Guest lecture by Tristan Deleu) Bayesian methods Model selection Causality | 5, 26 lecture23 recording Causality for Machine Learning (arXiv 2019) Elements of Causal Inference (book) | MVA lecture10 | Hwk 4 due Hwk 5 out (hwk 5 source) |
Dec 2 (Fri) | (Guest lecture on causality by Sébastien Lachapelle) | lecture24 recording | | Project: 1 page progress report due |
Dec 6 (Tu) | Non-parametric models: Gaussian processes Dirichlet processes | 25 lecture25 recording GP book DP tutorial | lecture26-2020.pdf source | |
Dec 9 (Fri) | No lecture: work on hwk 5 or project! | | | |
Dec 16 (Fri) | Poster presentation 2:00pm-5:00pm | | | Take-home final out |
Dec 23 (Fri) | | | | Hwk 5 due Project report due Take-home final due |