EECS 598-04: Probabilistic Graphical Models in Vision and Beyond
Course Description
Probabilistic graphical models have emerged as a powerful formalism
for combining principled probability theory with discrete, structured
data representations to model large-scale problems involving hundreds
or even thousands of inter-related variables. The course will cover
probabilistic graphical models in detail, starting from the basics and
progressing to contemporary results. Lectures will begin with the
foundations of graphical modeling, followed by a thorough discussion of
causal (Bayesian) and acausal (Markov) graphical models. The theoretical
bases of these two paradigms will be covered (e.g., Gibbs
distributions, the Hammersley-Clifford theorem). Learning and inference
methods (exact, approximate, discrete, and stochastic) will be
discussed in detail. Practical considerations for all three course
components (representation, learning, and inference) will be covered
through both course discussion and assignments.
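As a preview of the two representations named above, a Bayesian network factorizes a joint distribution according to each variable's parents, while, by the Hammersley-Clifford theorem, any strictly positive distribution satisfying the Markov properties of an undirected graph G factorizes as a Gibbs distribution over the cliques of G:

    P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid \mathrm{Pa}(X_i))    (Bayesian network)

    P(x) = \frac{1}{Z} \prod_{C \in \mathcal{C}(G)} \phi_C(x_C), \qquad Z = \sum_{x} \prod_{C \in \mathcal{C}(G)} \phi_C(x_C)    (Gibbs distribution / Markov network)

Here \mathrm{Pa}(X_i) denotes the parents of X_i in the directed graph, \mathcal{C}(G) the cliques of the undirected graph, \phi_C the nonnegative clique potentials, and Z the partition function. Both factorizations are developed in depth during the representation weeks below.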
There will be an emphasis on drawing problem formulations from computer vision,
since it provides ample problems well-suited for probabilistic graphical models,
but our coverage will be broad and include robotics, natural language
processing, computational finance, and other areas. No prior course in computer
vision or any other application area is required (although it will help with
some of the terminology).
Information Flow
This course uses CTools, Piazza, and the instructor's website.
- CTools will be used for minor tasks such as distributing protected
material and posting grades. The CTools site is
https://ctools.umich.edu/portal/site/b9118a40-6399-473a-9321-6a89e6a131b9
- The instructor's website (this page) will hold the schedule.
- Piazza is used for announcements, news, and discussions. The Piazza
course site is
http://piazza.com/umich/winter2015/eecs59804/home
(this link is also on the CTools site). Students should ensure they are
enrolled in the course on Piazza. Nearly all questions about the course,
both logistical and technical, should be posted to Piazza (after first
checking that the same question has not already been answered). Only in
the event of a privacy concern should you email the instructor directly.
Materials Needed
- The textbook for the course is Koller and Friedman, Probabilistic
Graphical Models: Principles and Techniques, MIT Press, 2009:
http://pgm.stanford.edu
- The software library for the course is OpenGM:
http://hci.iwr.uni-heidelberg.de/opengm2/
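For students who have not used OpenGM before, the following is a minimal sketch of constructing a tiny discrete model with OpenGM 2, written in the style of its quick-start documentation. The headers, type names, and template parameters below are recalled from that documentation and should be checked against the installed version.

    #include <cstddef>
    #include <opengm/graphicalmodel/graphicalmodel.hxx>
    #include <opengm/graphicalmodel/space/simplediscretespace.hxx>
    #include <opengm/functions/explicit_function.hxx>
    #include <opengm/operations/adder.hxx>

    int main() {
        // Label space: 2 variables, each taking one of 2 labels.
        typedef opengm::SimpleDiscreteSpace<std::size_t, std::size_t> Space;
        Space space(2, 2);

        // Graphical model whose factors are combined by addition (an energy
        // function) and stored as explicit value tables.
        typedef opengm::GraphicalModel<double, opengm::Adder,
                                       opengm::ExplicitFunction<double>, Space> Model;
        Model gm(space);

        // A unary factor on variable 0, given as an explicit table over its labels.
        std::size_t shape[] = {2};
        opengm::ExplicitFunction<double> f(shape, shape + 1);
        f(0) = 0.2;  // energy of assigning label 0
        f(1) = 1.5;  // energy of assigning label 1
        Model::FunctionIdentifier fid = gm.addFunction(f);

        std::size_t vars[] = {0};
        gm.addFactor(fid, vars, vars + 1);

        return 0;
    }

Pairwise factors and calls to the inference algorithms covered in class follow the same pattern; see the OpenGM manual linked above for the full interface.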
Schedule
The schedule will be updated throughout the term as needed. The term is split
into topic-weeks, with the instructor lecturing on the Wednesday
meeting and student teams lecturing on the subsequent Monday.
Each row belongs to one of four topic areas: Fundamentals, Representation,
Inference, or Learning.
Week | Day | Topic | Reading | Miscellaneous
1 | W 1/7 | Introduction | 1, 2 |
1 | M 1/12 | Foundations: Probability and Graph Theory | 2 |
2 | W 1/14 | Foundations: Practice and Programming | OpenGM Manual |
2 | M 1/19 | No Class (MLK Day) | |
3 | W 1/21 | Representation: Bayesian Networks | 2, 3, 5 |
3 | M 1/26 | Students: raywang, madantrg, sabean, zmykevin | |
3 | W 1/28 | Representation: Bayesian Networks | 2, 3, 5 |
3 | M 2/2 | No Class (instructor at NSF panel) | |
4 | W 2/4 | Representation: Markov Networks | 4, 8 | Project Team/Proposal Due
4 | M 2/9 | Student: | | Homework 1 Due
5 | W 2/11 | Representation: Markov Networks | 4, 6, 8 |
5 | M 2/16 | Student: | |
6 | W 2/18 | Inference: Variable Elimination | 9 |
6 | M 2/23 | Student: | |
7 | W 2/25 | Inference: Belief Propagation | 10 | Project Status Update 1 Due
7 | M 3/2 | No Class (Winter Break) | |
9 | W 3/4 | No Class (Winter Break) | |
9 | M 3/9 | Student: | |
10 | W 3/11 | Inference: Sampling | 12 |
10 | M 3/16 | Student: | |
11 | W 3/18 | Inference: MAP Estimation | 13 |
11 | M 3/23 | Student: | |
12 | W 3/25 | Learning: MLE in Bayesian Networks | 16, 17 | Project Status Update 2 Due
12 | M 3/30 | Student: | |
13 | W 4/1 | Learning: MLE in Markov Networks | 20 |
13 | M 4/6 | Student: | |
14 | W 4/8 | Learning: Structure | 18 |
14 | M 4/13 | Student: | |
15 | W 4/15 | Learning: Missing Data | 19 | Final Project Paper Due
15 | M 4/20 | Poster / Demo Day | |
Grading
- Homeworks (25%): There will be two analytical homework assignments given
in the first half of the semester. They will consist of mathematical problems
that test whether students understand the foundational topics. Each student
will complete these assignments independently.
- In-Class Presentations (25%): Each student will give one to three
in-class presentations. The presentation topic will be specified by the
instructor and may range from lecturing on a particular model or algorithm,
to discussing a particular implementation of a model or algorithm, to
presenting a paper that applies ideas from probabilistic graphical models to
a particular domain. Depending on the size of the course, these in-class
presentations may be given in small groups.
- Project (50%): Small groups of students will select, formulate, and
implement a project involving probabilistic graphical models in a
real-world setting, such as image understanding, robot navigation,
stock-market behavior, acoustic speech recognition, etc. Topics will be
selected and proposed by each group and then approved, or returned for
revision, by the instructor. Projects will be developed at full scale through
the term, culminating in a conference-length paper and a public poster/demo
session at the end of the term. In past semesters, such projects have gone on
to be published at first-tier conference and journal venues.
- No exams will be given in the course.