Course Information

Course Info: EECS 598-005, Fall 2015, 3 Credits
Instructor: Jacob Abernethy (Office: 3765 BBB, Email: jabernet_at_umich_dot_edu)
Time, Place: TuTh 3:00-4:30pm, 1005 DOW
Office Hours: Wednesdays 1:30-3pm

Course Description

This course will study theoretical aspects of prediction problems, where we seek to understand the mathematical underpinnings of machine learning. A primary objective of the class is to bring students to the frontiers of research in this area. The course will cover, among other things, concentration inequalities, uniform deviation bounds, Vapnik-Chervonenkis theory, Rademacher complexity, margin bounds, results for kernel methods, boosting, generalization bounds for artificial neural networks, online learning theory, and regret minimization. Along the way, we will dive into several related topics, including connections to convex optimization, minimax equilibrium in games, multi-armed bandit problems, calibration, sequential portfolio selection, option pricing, and differential privacy.
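As a small taste of the concentration-of-measure material mentioned above, here is a sketch (the function names and parameter choices are my own illustration, not course material) that checks Hoeffding's inequality empirically against a Monte Carlo estimate:

```python
import math
import random

def hoeffding_bound(n, t):
    # Hoeffding: for n i.i.d. samples in [0,1] with mean p,
    # P(|sample mean - p| >= t) <= 2 * exp(-2 * n * t^2).
    return 2 * math.exp(-2 * n * t * t)

def empirical_deviation_prob(n, p, t, trials=2000, seed=0):
    # Monte Carlo estimate of P(|sample mean - p| >= t)
    # for n Bernoulli(p) samples.
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            count += 1
    return count / trials

n, p, t = 200, 0.5, 0.1
emp = empirical_deviation_prob(n, p, t)
bound = hoeffding_bound(n, t)
print(f"empirical={emp:.4f}, Hoeffding bound={bound:.4f}")
```

The bound here is 2e^{-4} ≈ 0.037, noticeably looser than the true deviation probability, which is typical: Hoeffding trades tightness for generality.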

Prerequisites: Familiarity with the analysis of algorithms, probabilistic analysis, and related topics. EECS 545 (Machine Learning) will be quite helpful but is not strictly necessary. The material will be roughly 90% "theory," so prospective students must have a strong mathematical background. We shall rely heavily on techniques from calculus, probability, and convex analysis, but many tools will be presented in lecture.

Coursework: There will be 5 problem sets, and for the final project students may either conduct independent research or give a literature-review presentation to the class.

Grade Breakdown

50% for Homeworks: There will be 5 problem sets throughout the semester.
40% for Final Project: Students may review a research paper, conduct novel research, or implement algorithms in an interesting way. More details on this to come.
10% for Participation: Students must scribe 1-2 lectures and participate in class, and can earn additional credit by answering challenge questions. I will try to provide enough opportunities for this.

References

Roughly half of the course will follow material from the following text:

There is another text that has a few chapters I would like to cover:

  • "Prediction, Learning, and Games," by Nicolo Cesa-Bianchi and Gabor Lugosi

In recent years, a number of surveys have appeared that explore topics we shall cover. I will link to them here and mention them in lectures when appropriate:

Scribe Notes

  1. Lecture 1, 9/8: Course Overview and Linear Algebra Review
  2. Lecture 2, 9/10: Convex Analysis
  3. Lecture 3, 9/15: Concentration Inequalities
  4. Lecture 4, 9/17: Hoeffding’s Inequality and Martingales
  5. Lecture 5, 9/22: PAC Learning Basics
  6. Lecture 6, 9/24: Risk and PAC-Learnable Class
  7. Lecture 7, 9/29: General PAC Guarantee
  8. Lecture 8, 10/01: PAC Guarantee for Infinite Sets and Growth Function
  9. Lecture 9, 10/06: Relations between Lectures and Sauer’s Lemma
  10. Lecture 10, 10/08: PAC Learning Lower Bounds
  11. Lecture 11, 10/13: Margin Theory
  12. Lecture 12, 10/15: Noisy Setting and Rademacher Complexity
  13. Lecture 13, 10/27: Rademacher Complexity and Massart's Lemma
  14. Lecture 14, 10/29: Growth Function Generalization Bound and Online Learning
  15. Lecture 15, 11/03: Neural Networks Theory
  16. Lecture 16, 11/05: Perceptron and Exponential Weights Algorithm
  17. Lecture 17, 11/10: Exponential Weights Algorithm and Game Theory
  18. Lecture 18, 11/12: Nash’s Theorem and Von Neumann’s Minimax Theorem
  19. Lecture 19, 11/17: Fast Rates I
  20. Lecture 20, 11/19: Fast Rates II
  21. Lecture 21, 12/01: Bandit Problem
  22. Lecture 22, 12/03: Adversarial Multi-Armed Bandits and EXP3
  23. Lecture 23, 12/08: Online Convex Optimization
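Lectures 16 and 17 above cover the exponential weights algorithm. As a rough illustration of what that analysis buys you (this sketch, including the toy loss sequence and the learning-rate choice, is my own and not from the course notes), the algorithm's regret against the best expert stays within the standard O(sqrt(T log K)) bound:

```python
import math

def exponential_weights(loss_rounds, eta):
    # Hedge / exponential weights over K experts.
    # loss_rounds: list of per-round loss vectors, each in [0,1]^K.
    # Returns (algorithm's total expected loss, best expert's total loss).
    k = len(loss_rounds[0])
    weights = [1.0] * k
    alg_loss = 0.0
    expert_totals = [0.0] * k
    for losses in loss_rounds:
        z = sum(weights)
        probs = [w / z for w in weights]  # play expert i with prob. probs[i]
        alg_loss += sum(p * l for p, l in zip(probs, losses))
        # Multiplicative update: downweight experts that suffered loss.
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
        expert_totals = [t + l for t, l in zip(expert_totals, losses)]
    return alg_loss, min(expert_totals)

# Toy sequence: two experts whose losses alternate every round.
T, K = 100, 2
losses = [[0.0, 1.0] if t % 2 == 0 else [1.0, 0.0] for t in range(T)]
eta = math.sqrt(8 * math.log(K) / T)
alg, best = exponential_weights(losses, eta)
print(f"regret={alg - best:.3f}, bound={math.sqrt(T * math.log(K) / 2):.3f}")
```

With this eta, the classical guarantee is regret at most sqrt((T/2) ln K); on the alternating sequence the realized regret is well under that bound.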

Homeworks

  1. Homework #1 - Due 9/28/2015
  2. Homework #2 - Due 10/16/2015
  3. Homework #3 - Due 10/30/2015
  4. Homework #4 - Due 11/20/2015
  5. Homework #5 - Due 12/10/2015