
EECS 59804: Probabilistic Graphical Models in Vision and Beyond
Course Description
Probabilistic graphical models have emerged as a powerful formalism
for leveraging principled probability theory and discrete structured
data representations to model large-scale problems involving hundreds
or even thousands of interrelated variables. The course will cover
probabilistic graphical models in detail, starting from the basics and
continuing through contemporary results. Lectures will begin with the
foundations of graphical modeling, followed by a thorough discussion of
causal (Bayesian) and acausal (Markov) graphical models. The theoretical
bases of these two paradigms will be covered (e.g., Gibbs
distributions, the Hammersley-Clifford theorem). Learning and inference
methods (exact, approximate, discrete, and stochastic) will be
discussed in detail. Practical considerations for all three course
components (representation, learning, and inference) will be covered
through both course discussion and assignments.
There will be an emphasis on drawing problem formulations from computer vision,
since it provides ample problems well-suited to probabilistic graphical models,
but our coverage will be broad and will include robotics, natural language
processing, computational finance, and other areas.
No prior course in computer vision or any other application area is needed
(although familiarity with some of the terminology will help).
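As a small illustration of the kind of model the course studies (this sketch is not part of the official course materials, and the network and its probability values are made up for illustration), the following Python snippet defines a three-variable Bayesian network over binary variables Rain, Sprinkler, and WetGrass, factorizes the joint via the chain rule, and answers a posterior query by brute-force enumeration:

```python
# A toy Bayesian network over binary variables (illustrative numbers only):
# Rain -> Sprinkler, and (Rain, Sprinkler) -> WetGrass.
p_rain = {0: 0.8, 1: 0.2}
p_sprinkler_given_rain = {0: {0: 0.6, 1: 0.4},
                          1: {0: 0.99, 1: 0.01}}
p_wet_given = {  # keyed by (sprinkler, rain)
    (0, 0): {0: 1.0, 1: 0.0},
    (0, 1): {0: 0.2, 1: 0.8},
    (1, 0): {0: 0.1, 1: 0.9},
    (1, 1): {0: 0.01, 1: 0.99},
}

def joint(r, s, w):
    """Chain-rule factorization: P(R, S, W) = P(R) P(S | R) P(W | S, R)."""
    return p_rain[r] * p_sprinkler_given_rain[r][s] * p_wet_given[(s, r)][w]

def posterior_rain_given_wet(w=1):
    """P(Rain | WetGrass = w) by summing the joint over Sprinkler."""
    unnorm = [sum(joint(r, s, w) for s in (0, 1)) for r in (0, 1)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

print(posterior_rain_given_wet(1))  # posterior over Rain in {0, 1}
```

Brute-force enumeration is exponential in the number of variables; much of the course concerns inference algorithms (variable elimination, belief propagation, sampling) that exploit the graph structure to avoid exactly this blow-up.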
Information Flow
This course uses CTools, Piazza, and the instructor's website.
- CTools will be used for minor tasks such as distributing
protected material and accessing grades. The CTools site is
https://ctools.umich.edu/portal/site/b9118a406399473a93216a89e6a131b9
- This website will hold the schedule.
- Piazza is used for announcements, news, and discussion. The
Piazza course site is
http://piazza.com/umich/winter2015/eecs59804/home
(this link is also on the CTools site). Students should ensure they are enrolled in
the course. Nearly all questions about the course, both logistical and
technical, should be posted to Piazza (after checking that the same
question has not already been asked and answered). Only in the event of a
privacy concern should you email the instructor directly.
Materials Needed
The textbook for the course is Koller and Friedman, Probabilistic
Graphical Models: Principles and Techniques, MIT Press, 2009 (http://pgm.stanford.edu).
The software library for the course is OpenGM:
http://hci.iwr.uni-heidelberg.de/opengm2/
Schedule
The schedule will be updated throughout the term as needed. The term is split
into topic-weeks, with the instructor lecturing at the Wednesday
meeting and student teams lecturing on the subsequent Monday.
Row color denotes topic area: Fundamentals, Representation, Inference, Learning.
Week | Day    | Topic                                         | Reading       | Miscellaneous
1    | W 1/7  | Introduction                                  | 1, 2          |
1    | M 1/12 | Foundations: Probability and Graph Theory     | 2             |
2    | W 1/14 | Foundations: Practice and Programming         | OpenGM Manual |
2    | M 1/19 | No Class (MLK Day)                            |               |
3    | W 1/21 | Representation: Bayesian Networks             | 2, 3, 5       |
3    | M 1/26 | Students: raywang, madantrg, sabean, zmykevin |               |
3    | W 1/28 | Representation: Bayesian Networks             | 2, 3, 5       |
3    | M 2/2  | No Class (instructor at NSF panel)            |               |
4    | W 2/4  | Representation: Markov Networks               | 4, 8          | Project Team/Proposal Due
4    | M 2/9  | Student:                                      |               | Homework 1 Due
5    | W 2/11 | Representation: Markov Networks               | 4, 6, 8       |
5    | M 2/16 | Student:                                      |               |
6    | W 2/18 | Inference: Variable Elimination               | 9             |
6    | M 2/23 | Student:                                      |               |
7    | W 2/25 | Inference: Belief Propagation                 | 10            | Project Status Update 1 Due
7    | M 3/2  | No Class (Winter Break)                       |               |
9    | W 3/4  | No Class (Winter Break)                       |               |
9    | M 3/9  | Student:                                      |               |
10   | W 3/11 | Inference: Sampling                           | 12            |
10   | M 3/16 | Student:                                      |               |
11   | W 3/18 | Inference: MAP Estimation                     | 13            |
11   | M 3/23 | Student:                                      |               |
12   | W 3/25 | Learning: MLE in Bayesian Networks            | 16, 17        | Project Status Update 2 Due
12   | M 3/30 | Student:                                      |               |
13   | W 4/1  | Learning: MLE in Markov Networks              | 20            |
13   | M 4/6  | Student:                                      |               |
14   | W 4/8  | Learning: Structure                           | 18            |
14   | M 4/13 | Student:                                      |               |
15   | W 4/15 | Learning: Missing Data                        | 19            | Final Project Paper Due
15   | M 4/20 | Poster / Demo Day                             |               |
Grading
- Homeworks (25%): There will be two analytical homework assignments given
in the first half of the semester. They will consist of mathematical problems that
test whether students understand the foundational topics. These assignments
will be done independently by each student.
- In-Class Presentation (25%): Each student will give one to three
in-class presentations. The presentation topic will be specified by
the instructor and will vary from lecturing on a particular model or algorithm,
to discussing a particular implementation of a model or algorithm, to presenting a
paper that applies ideas from probabilistic graphical models to various domains.
Depending on the size of the course, these in-class presentations may be given in
small groups.
- Project (50%): Small groups of students will select, formulate, and
implement a project involving probabilistic graphical models in a
real-world setting, such as image understanding, robot navigation,
stock-market behavior, acoustic speech recognition, etc. Topics will be
selected and proposed by the group and then approved (or rejected for revision) by
the instructor. Full-scale project development will proceed through the term,
culminating in a conference-length paper and a public poster/demo session at the
end of the term. In past semesters, such projects have gone on to be published
at first-tier conference and journal venues.
- No exams will be given in the course.
