IFT 6269 : Probabilistic Graphical Models - Fall 2019

Description

This course provides a unifying introduction to statistical modeling of multidimensional data through the framework of probabilistic graphical models, together with their associated learning and inference algorithms.

People

Class info

Tentative content

(Detailed outline below)

Evaluation

Prerequisites

Textbook

Homework logistics

Detailed outline (updated often)

Below is a draft detailed outline that will be updated as the class goes on. For now it is the outline copied from the Fall 2018 version of this class, with links to the relevant old scribbled notes, which will be replaced gradually with the new ones. The related chapters in Mike's book are given (note that they do not exactly correspond to the class content), along with occasional pointers to the Koller and Friedman book (KF), the Deep Learning book (DL), or Bishop's book (B). Related past "scribe notes" from the class I taught in Paris are listed for now, and will be replaced with this year's scribe notes as I receive them (if I receive some).

Date | Topics | Related chapters | Scribbled notes | Scribe notes | Homework milestones
Sept 3 | Set-up & overview | | intro slides; lecture1 | Isabela Albuquerque (Fa17): lecture1.pdf, source |
Sept 6 | Probability review | 2.1.1; DL: 3 (nice and gentle); KF: 2.1 (more rigorous) | lecture2 | William Léchelle (Fa16): lecture2.pdf, source |
Sept 10 | Parametric models; Frequentist vs. Bayesian | 5 | lecture3 | Philippe Brouillard and Tristan Deleu (Fa17): lecture3.pdf, source | Hwk 1 out (hwk 1 source)
Sept 13 | Bayesian (cont.); Maximum likelihood | | lecture4 | Philippe Brouillard and Tristan Deleu (Fa17): lecture4.pdf |
Sept 17 | MLE (cont.); Statistical decision theory | 1.3 in Bickel & Doksum; Bias-variance tradeoff: 7.3 in Hastie's book | lecture5 | Sébastien Lachapelle (Fa17): lecture5.pdf, source |
Sept 20 | Properties of estimators | | old lecture6 | same as lecture5 |
Sept 24 | Linear regression; Logistic regression | DL: 7.1 (l2, l1-reg.); 6, 7 | old lecture7 | Zakaria Soliman (Fa16): lecture6.pdf, source | Hwk 1 due; Hwk 2 out
Sept 27 | Optimization; Logistic regression (cont.) + IRLS | 7; DL: 4.3; Boyd's book | old lecture8; MVA lecture2 | |
Oct 1 | Gen. classification (Fisher); Derivative tricks for Gaussian MLE; Kernel trick (skipped); K-means | Matrix Diff. book; 10, 11 | lecture9 2017 (kernel trick); old lecture9; MVA lecture3 | |
Oct 4 | GMM and EM | 10, 11 | old lecture10 | |
Oct 8 | Graph theory; Directed graphical models | 2 | old lecture11; MVA lecture4 | | Hwk 2 due; Hwk 3 out
Oct 11 | DGM (cont.); Undirected graphical models | 2 | old lecture12 | |
Oct 15 | UGM (cont.); Inference: elimination alg. | 3 | old lecture13 | |
Oct 18 | Sum-product alg.; Max-product; Junction tree | 4, 17 | old lecture14; MVA lecture7 | |
Oct 22 | Break: look at projects | | | |
Oct 25 | Break: look at projects | | | |
Oct 29 | HMM and EM | 12 | old lecture15 | | Hwk 3 due; Hwk 4 out
Nov 1 | Class cancelled: look at projects | | | |
Nov 5 | Class cancelled: look at projects | | | |
Nov 8 | Information theory; Max entropy; Duality | 19 | old lecture16; MVA lecture5 | |
Nov 12 | MaxEnt duality; Exponential families | 8 | (KL geometry: old lecture16); lecture17 2017; MVA lecture6; MVA lecture8 | | Project: team formed
Nov 15 | Sampling | 21 | old lecture18 | |
Nov 19 | MCMC sampling | 21 | (variance reduction: see old lecture18 2017); old lecture19 | |
Nov 22 | Non-parametric models: Gaussian processes; Dirichlet processes | 25; GP book; DP tutorial | old lecture20 slides | |
Nov 26 | MCMC (cont.); Gibbs sampling | | lecture21 | | Hwk 4 due; Hwk 5 out
Nov 29 | Variational methods; Estimation in graphical models | Bishop: 10.1; 9 | old lecture22; (skipped parts of: old lecture22 2017) | |
Dec 3 | Bayesian methods; Model selection | 5, 26 | old lecture23; MVA lecture10 | | Project: 1 page progress report due
Dec 6 (last lecture) | Gaussian networks; Factor analysis, PCA, CCA; (Kalman filter); VAE | 13; 14, (15); VAE – DL: 20.10.3 | old lecture24; old lecture17 Fa2016; old lecture18 Fa2016 | |
Dec 10 | No lecture this week (NeurIPS): work on your project! | | | |
Dec 17 | Poster presentation; Time and place TBC (most likely 1:00pm-4:00pm) | | | | Hwk 5 due; Take-home final out
Dec 23 | | | | | Project report due; Take-home final due (online)