CS7140 - Spring 2018 - Advanced Machine Learning


Time: MW 2.50pm - 4.30pm
Room: Ryder 158


Jan-Willem van de Meent [personal page]
E-mail: contact
Phone: +1 617 373 7696
Office Hours: Monday 5.00pm - 6.00pm (or by appointment)

Teaching Assistant

Babak Esmaeili [personal page]
E-mail: contact
Office Hours: WVH 208, Wednesday 5.00pm - 6.00pm (or by appointment)


Course Overview

This course covers Bayesian methods for probabilistic modeling and inference. The topics covered are on the advanced end of the spectrum of those found in machine learning textbooks:

  • Graphical Models
  • Exponential Families
  • Mixture Models, Hidden Markov Models, Latent Dirichlet Allocation
  • Bayesian Regression, Gaussian Processes
  • Importance Sampling and Markov chain Monte Carlo Methods
  • Gradient estimators (likelihood-ratio, reparameterized)
  • Exact Inference on Trees
  • Variational Inference
  • Expectation Propagation
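To give a flavor of the first of these topics, the sketch below (not course material; the function name is illustrative) shows conjugate Bayesian updating for a Beta-Bernoulli model, the kind of exact inference treated in the opening lectures.

```python
# Illustrative sketch: conjugate updating for a Beta-Bernoulli model.
# With a Beta(alpha, beta) prior on the success probability and 0/1
# observations, the posterior is again a Beta distribution whose
# parameters are the prior parameters plus the observed counts.
from collections import Counter

def beta_bernoulli_update(alpha, beta, observations):
    """Posterior Beta parameters after observing a list of 0/1 outcomes."""
    counts = Counter(observations)
    return alpha + counts[1], beta + counts[0]

# Uniform prior Beta(1, 1); observe 7 successes and 3 failures.
post_a, post_b = beta_bernoulli_update(1, 1, [1] * 7 + [0] * 3)
posterior_mean = post_a / (post_a + post_b)  # 8 / 12
```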

Towards the end of the semester, we will additionally cover topics that are the subject of active research, including:

  • Stochastic Variational Inference
  • Variational Autoencoders
  • Generative Adversarial Networks
  • Bayesian Optimization
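Several of these advanced topics rest on the reparameterized gradient estimator listed earlier. As a minimal sketch (not from the course materials; the function name and toy objective are illustrative), the example below estimates d/dmu E[x^2] for x ~ N(mu, 1) by writing x = mu + eps with eps ~ N(0, 1), so each sample contributes gradient 2x.

```python
# Illustrative sketch: a reparameterized Monte Carlo gradient estimator.
# The noise eps is drawn independently of mu, so the gradient of each
# sample x^2 with respect to mu follows by the chain rule: d(x^2)/dmu = 2x.
import random

def reparam_grad_estimate(mu, n_samples=100_000, seed=0):
    """Monte Carlo estimate of d/dmu E[x^2] for x ~ N(mu, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        eps = rng.gauss(0.0, 1.0)  # noise independent of mu
        x = mu + eps               # reparameterized sample
        total += 2.0 * x           # per-sample gradient of x^2 w.r.t. mu
    return total / n_samples

# The true gradient is 2 * mu, since E[x^2] = mu^2 + 1.
estimate = reparam_grad_estimate(1.5)  # close to 3.0
```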


Prerequisites

CS 6140: Machine Learning, CS 6220: Data Mining Techniques, or an equivalent course.

Students are expected to have a good working knowledge of basic linear algebra, probability, statistics, and algorithms. While the lectures are designed to be self-contained, students are expected to be comfortable with the basic topics in machine learning (regression, classification, dimensionality reduction, etc.).


Textbooks

This class is not structured to directly follow the outline of a textbook. In addition to the lecture notes, the textbook by Kevin Murphy will be used as a reference. The textbook by Barber can be used as a supplementary resource, and is freely available online.


  • Required: Kevin Murphy, Machine Learning: a Probabilistic Perspective, MIT Press, 2013 [Website].
  • Recommended: David Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012 [PDF freely available].

Additional Materials


Scribing

Each lecture will have 2-3 assigned scribes who will be jointly responsible for producing PDF notes in LaTeX (we will make a template available). Each student will be assigned to two groups over the course of the semester. Notes will be due 1 week after the class takes place, and will be graded as part of the course work.


Homework

The homework in this class will consist of 5 problem sets, which combine mathematical derivations with programming exercises in Python. Submissions must be made via Blackboard by 11.59pm on the due date.


Project

The goal of the project is to gain experience in implementing and testing one of the methods covered in the lectures. Students will collaborate in groups of 2-4. We will provide a list of suggested problems to choose from.

Participation and Collaboration

Students are expected to attend lectures and actively participate by asking questions. While students are required to complete homework programming exercises individually, helping fellow students by explaining course material is encouraged.


Grading

The final grade for this course will be weighted as follows:

  • Homework: 30%
  • Scribing: 20%
  • Midterm Exam: 15%
  • Final Exam: 15%
  • Course Project: 20%


Feedback

Students will be asked to indicate the amount of time spent on each homework assignment, as well as the project. They will also be able to indicate what they think went well, and what they think did not go well. There will also be an opportunity to provide feedback on the class after the midterm exam.


Schedule

Note: This schedule is subject to change and will be adjusted as needed throughout the semester.

Week Date Lectures Assignments Reading
1 Mon 08 Jan Syllabus [slides], Probability, Inference, Conjugacy [slides, notes]   Murphy 2; Barber 1
  Wed 10 Jan Bayesian Regression, Kernel Ridge Regression [slides, notes]   Murphy 7; Barber 17.1-17.3,18.1
2 Mon 15 Jan (Martin Luther King Jr. Day)    
  Wed 17 Jan Gaussian Processes [slides, notes]   Murphy 14.1-14.4, 15; Barber 19.1-19.4
3 Mon 22 Jan Graphical Models [slides, notes]   Murphy 10; Barber 2-4
  Wed 24 Jan Importance Sampling, Sequential Monte Carlo [slides, notes] HW1 due (Fri) Murphy 23; Barber 27.6
4 Mon 29 Jan Metropolis-Hastings, Gibbs Sampling [slides, notes]   Murphy 24.1-24.4; Barber 27.1-27.4
  Wed 31 Jan Hamiltonian Monte Carlo [slides, notes]   Murphy 24.5; Barber 27.5; Neal; Betancourt
5 Mon 05 Feb (No Class)    
  Wed 07 Feb Expectation Maximization [slides, notes] HW2 due (Fri) Murphy 11; Barber 11.1-11.4
6 Mon 12 Feb Variational Inference [slides, notes]   Murphy 21; Blei
  Wed 14 Feb Variational Inference (continued) [slides, notes]   Kucukelbir et al.
7 Mon 19 Feb (Presidents’ Day)    
  Wed 21 Feb Latent Dirichlet Allocation 1 [slides, notes]   Murphy 27.3; Barber 20.6.1
8 Mon 26 Feb Latent Dirichlet Allocation 2 [slides, notes] HW3 due (Sun) Murphy 27.3; Barber 20.6.1
  Wed 28 Feb (Midterm)    
9 Mon 05 Mar (Spring Break)    
  Wed 07 Mar (Spring Break)    
10 Mon 12 Mar (project discussion) [slides]    
  Wed 14 Mar Stochastic Variational Inference [slides, notes]   Hoffman et al. [JMLR 2013]
11 Mon 19 Mar Black Box Variational Inference [slides, notes]   Ranganath et al. [AISTATS 2014]
  Wed 21 Mar Variational Autoencoders [slides, notes]   Doersch [Tutorial]
12 Mon 26 Mar Variable Elimination, Belief Propagation [slides, notes]   Murphy 20.1-20.3; Barber 5.1
  Wed 28 Mar Loopy Belief Propagation, Expectation Propagation [slides, notes]   Murphy 22.1-22.5; Barber 28.7-28.8
13 Mon 02 Apr The Junction Tree Algorithm [slides, notes]   Murphy 20.4; Barber 6
  Wed 04 Apr Generative Adversarial Networks [slides]   Goodfellow [NIPS Tutorial]
14 Mon 09 Apr Bayesian Optimization [slides]    
  Wed 11 Apr (Review) [slides] HW4 due (Fri)  
15 Mon 16 Apr (Patriots’ Day)    
  Wed 18 Apr (Exam)    
16 Mon 23 Apr (No Class)    
  Wed 25 Apr Project Presentations Project due (Wed)