EECS 598-04
Graphical Models
Winter 2015

Course Flyer



EECS 598-04: Probabilistic Graphical Models in Vision and Beyond

Instructor: Jason Corso (jjcorso)
Time/Place: MW 0900-1030 in 1003 EECS
Office Hours: M 1500-1600, W 1030-1130 in 4227 EECS

Course Description

Probabilistic graphical models have emerged as a powerful formalism for combining principled probability theory with structured, discrete data representations to model large-scale problems involving hundreds or even thousands of inter-related variables. The course will cover probabilistic graphical models in detail, starting from the basics and pushing through to contemporary results. Lectures will begin with the foundations of graphical modeling, followed by a thorough discussion of causal (Bayesian) and acausal (Markov) graphical models. The theoretical bases of these two paradigms (e.g., Gibbs distributions, the Hammersley-Clifford theorem) will be covered. Learning and inference methods (exact and approximate, discrete and stochastic) will be discussed in detail. Practical considerations for all three course components (representation, learning, and inference) will be covered through both course discussion and assignments.

There will be an emphasis on drawing problem formulations from computer vision, since it provides ample problems well suited to probabilistic graphical models, but our coverage will be broad and will include robotics, natural language processing, computational finance, and other areas. No prior course in computer vision or any other application area is needed, although familiarity with the terminology will help.
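To give a concrete taste of what "principled probability theory plus structured representations" looks like, here is a minimal sketch of a three-variable Bayesian network and exact inference by enumeration. This is purely illustrative, not course material: the variables and all probability values are made up, and plain Python stands in for the course's software tools.

```python
# Illustrative sketch: a three-variable Bayesian network
#   Rain -> Sprinkler, Rain -> GrassWet, Sprinkler -> GrassWet,
# whose joint distribution factorizes along the graph as
#   P(R, S, W) = P(R) * P(S | R) * P(W | R, S).
# All probability values below are invented for illustration.
from itertools import product

P_R = {True: 0.2, False: 0.8}                       # P(Rain)
P_S_given_R = {True: {True: 0.01, False: 0.99},     # P(Sprinkler | Rain)
               False: {True: 0.4, False: 0.6}}
P_W_given_RS = {                                    # P(GrassWet | Rain, Sprinkler)
    (True, True):   {True: 0.99, False: 0.01},
    (True, False):  {True: 0.80, False: 0.20},
    (False, True):  {True: 0.90, False: 0.10},
    (False, False): {True: 0.00, False: 1.00},
}

def joint(r, s, w):
    """Joint probability read directly off the network's factorization."""
    return P_R[r] * P_S_given_R[r][s] * P_W_given_RS[(r, s)][w]

# Exact inference by enumeration: P(Rain = True | GrassWet = True).
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(round(num / den, 4))  # ≈ 0.3577
```

Enumeration is exponential in the number of variables; much of the course is about exploiting the graph structure (variable elimination, belief propagation, sampling) to do better.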

Information Flow

This course uses CTools, Piazza, and the instructor's website.

  • CTools will be used for some minor tasks, such as distributing protected material and accessing grades.
  • This website will hold the schedule.
  • Piazza is used for announcements, news, and discussion. The Piazza course site is linked from the CTools site. Students should ensure they are enrolled in the course on Piazza. Nearly all questions about the course, both logistical and technical, should be posted to Piazza, after first checking whether the same question has already been answered. Only in the event of a privacy concern should you email the instructor directly.

Materials Needed

The textbook for the course is Koller and Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press 2009.
The software library for the course is OpenGM.
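For a flavor of the models OpenGM is designed for, here is a minimal sketch of a pairwise Markov network in plain Python (deliberately not the OpenGM API, whose details are not reproduced here). By the Hammersley-Clifford theorem, discussed in the course, the network's distribution is a Gibbs distribution: a normalized product of clique potentials. The chain size and potential values below are invented for illustration.

```python
# Illustrative sketch (plain Python, not the OpenGM API): a three-variable
# binary chain Markov network x0 - x1 - x2 with pairwise potentials.
# Its Gibbs distribution is p(x) = (1/Z) * prod_c phi_c(x_c), the product
# running over the cliques (here, the two edges).
from itertools import product
import math

def phi(a, b):
    """Pairwise potential favoring equal neighbors (invented numbers)."""
    return math.exp(1.0 if a == b else -1.0)

def unnormalized(x):
    x0, x1, x2 = x
    return phi(x0, x1) * phi(x1, x2)   # product over the two edge cliques

# Partition function Z by brute-force enumeration (feasible only for tiny models).
Z = sum(unnormalized(x) for x in product((0, 1), repeat=3))

def p(x):
    return unnormalized(x) / Z

print(p((0, 0, 0)) > p((0, 1, 0)))  # aligned configurations are more probable
```

Computing Z is the hard part in general; the inference portion of the course (variable elimination, belief propagation, sampling, MAP estimation) is largely about avoiding this brute-force enumeration.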


Schedule

The schedule will be updated throughout the term as needed. The term is split into topic-weeks, with the instructor lecturing on the Wednesday meeting and student teams lecturing on the subsequent Monday. Each lecture falls into one of four topic areas: Fundamentals, Representation, Inference, and Learning. Readings are chapters of the course textbook (Koller and Friedman) unless otherwise noted.

Week  Date  Topic                                          Reading        Due
1W    1/7   Introduction                                   1, 2
1M    1/12  Foundations: Probability and Graph Theory      2
2W    1/14  Foundations: Practice and Programming          OpenGM Manual
2M    1/19  No class (MLK Day)
3W    1/21  Representation: Bayesian Networks              2, 3, 5
3M    1/26  Students: raywang, madantrg, sabean, zmykevin
4W    1/28  Representation: Bayesian Networks              2, 3, 5
4M    2/2   No class (instructor at NSF panel)
5W    2/4   Representation: Markov Networks                4, 8           Project team/proposal due
5M    2/9   Student:                                                      Homework 1 due
6W    2/11  Representation: Markov Networks                4, 6, 8
6M    2/16  Student:
7W    2/18  Inference: Variable Elimination                9
7M    2/23  Student:
8W    2/25  Inference: Belief Propagation                  10             Project status update 1 due
8M    3/2   No class (winter break)
9W    3/4   No class (winter break)
9M    3/9   Student:
10W   3/11  Inference: Sampling                            12
10M   3/16  Student:
11W   3/18  Inference: MAP Estimation                      13
11M   3/23  Student:
12W   3/25  Learning: MLE in Bayesian Networks             16, 17         Project status update 2 due
12M   3/30  Student:
13W   4/1   Learning: MLE in Markov Networks               20
13M   4/6   Student:
14W   4/8   Learning: Structure                            18
14M   4/13  Student:
15W   4/15  Learning: Missing Data                         19             Final project paper due
15M   4/20  Poster / Demo Day


Grading

  • Homeworks (25%) There will be two analytical homework assignments given in the first half of the semester. They will be mathematical problems that test whether students understand the foundational topics. Each student will complete these assignments independently.
  • In-Class Presentations (25%) Each student will give one to three in-class presentations. The presentation topic will be specified by the instructor and will range from lecturing on a particular model or algorithm, to discussing a particular implementation of a model or algorithm, to presenting a paper that applies probabilistic graphical models to some domain. Depending on the size of the course, these presentations may be given in small groups.
  • Project (50%) Small groups of students will select, formulate, and implement a project involving probabilistic graphical models in a real-world setting, such as image understanding, robot navigation, stock-market behavior, or acoustic speech recognition. Topics will be selected and proposed by the group and then approved, or returned for revision, by the instructor. Projects will be developed throughout the term, culminating in a conference-length paper and a public poster/demo session at the end of the term. In past semesters, such projects have gone on to be published at first-tier conferences and journals.
  • No exams will be given in the course.

last updated: Tue Sep 6 22:19:34 2016; copyright jcorso
Please report broken links to Prof. Corso.