University of Michigan, Fall 2011

Instructor: Clayton Scott

Classroom: Dow 1010

Time: MW 10:30--12:00

Office: 4433 EECS

Email: clayscot

Office hours: Mon. 2-4

GSI: Takanori Watanabe (takanori)

GSI office hours: Tue. 3-5, 2420 EECS

__Required text__: None. I will share my lecture notes prior to
each lecture.

__Primary recommended text__:

- Hastie, Tibshirani, and Friedman,
*The Elements of Statistical Learning: Data Mining, Inference, and Prediction*, Springer, Second Edition (available online).

__Other recommended texts__:

- Duda, Hart, and Stork, *Pattern Classification*, Wiley, 2001.
- Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.
- Sutton and Barto, *Reinforcement Learning: An Introduction*, MIT Press, 1998.

__Additional references__

- Devroye, Gyorfi, and Lugosi, *A Probabilistic Theory of Pattern Recognition*, Springer, 1996.
- Scholkopf and Smola, *Learning with Kernels*, MIT Press, 2002.
- Mardia, Kent, and Bibby, *Multivariate Analysis*, Academic Press, 1979 (good for PCA, MDS, and factor analysis).
- Boyd and Vandenberghe, *Convex Optimization*, Cambridge University Press, 2004.

__Prerequisites__: (the formal prerequisite is currently listed as
EECS 492, Artificial Intelligence, but this is inaccurate)

- Probability: jointly distributed random variables, multivariate densities and mass functions, expectation, independence, conditional distributions, Bayes' rule, the multivariate normal distribution.
- Linear algebra: rank, nullity, linear independence, inner products, orthogonality, positive (semi-) definite matrices, eigenvalue decompositions.

__Topics__:

These are the projected topics for 2011. I can also lecture on new topics depending on student interest. Applications will be developed through MATLAB programming exercises.

__Supervised Learning__

- Nearest neighbors classification
- The Bayes classifier
- Linear discriminant analysis
- Logistic regression
- Naive Bayes
- Separating hyperplanes
- Least squares linear regression
- Locally linear regression
- Regularization
- Inner product kernels and the kernel trick
- Kernel ridge regression
- Constrained optimization
- Support vector machines

__Unsupervised Learning__

- Principal component analysis
- K-means clustering
- The EM algorithm for Gaussian mixture models
- Kernel density estimation

__Reinforcement Learning__

- Markov decision processes
- Optimal policies
- Learning policies from experience
- Value function approximation

__Additional Topics__

- Model selection and error estimation
- Feature selection
- Spectral clustering
- Multidimensional scaling
- Nonlinear dimensionality reduction
- Decision trees
- Ensemble methods
- Boosting
- Hierarchical clustering
- Reproducing kernel Hilbert spaces
- Neural networks
- Learning theory
- Dirichlet processes
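To give a flavor of the first topic on the list, here is a minimal sketch of a nearest-neighbor classifier. Course assignments will use MATLAB; the sketch below is written in Python/NumPy purely for illustration, with made-up toy data.

```python
import numpy as np

def nn_classify(X_train, y_train, x):
    """Label x with the label of its nearest training point
    (1-nearest-neighbor under Euclidean distance)."""
    dists = np.linalg.norm(X_train - x, axis=1)  # distance to each training point
    return y_train[np.argmin(dists)]             # label of the closest one

# Toy data: two well-separated clusters with labels 0 and 1.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])

print(nn_classify(X, y, np.array([0.05, 0.1])))  # → 0
print(nn_classify(X, y, np.array([0.95, 1.0])))  # → 1
```

The same idea is a one-liner in MATLAB using `min` over row-wise distances; we will develop it, and its k-neighbor generalization, in the first programming exercise.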

__Grading__:

Homework: 30%

Midterm exam: 30%, Thursday November 10, 6-9 PM, location TBA.

Final project: 40%

__Homeworks__:

About four homework assignments will be given before the midterm. After
the midterm you will work on your project.

__Computer programming__

Most or all assignments will involve some computer programming.
MATLAB will serve as the official programming language of the course.
I will sometimes provide
you with fragments of code, or suggested commands, in MATLAB.

__Group work__:

Group work will take place on two levels. You will work on homeworks in
*small groups* of 2, and the final project in *large groups* of
3 or 4. I will help you find groups as needed.

__Make-up class__:

I expect to be attending a conference on the last
day of classes, Dec. 13. Therefore I am scheduling a make-up class for
Thursday, September 15, 6:00 - 7:30 PM, location TBD.

__Exam__: Thursday November 10, 6-9 PM.

Collaboration of any form will not be allowed on the exam. Allowed
materials will be specified in advance of the exam. Notify me this week
if you have a conflict.

__Final Project__:

There will be a final, open-ended group project. The project must explore
a methodology or application (and preferably both) not covered in the
lectures.

__Collaboration__:

Each group will turn in one product representative of the group.
Solutions to homework problems obtained from outside sources may not be
used.

__Honor Code__

All undergraduate and graduate students are expected to abide by the
College of Engineering Honor Code as stated in the Student Handbook and
the Honor Code Pamphlet.

__Students with Disabilities__

Any student with a documented disability needing academic adjustments or
accommodations is requested to speak with me during the first two weeks of
class. All discussions will remain confidential.