Instructor: Jiayu Zhou [email]
Time: Tuesday and Thursday 2:40pm - 4:00pm
Location: Engineering Building, EB 2205
Office Hours: Tuesday and Thursday 4:00pm-5:00pm, EB 2134
Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., that learn to spot high-risk medical patients, recognize speech, classify text documents, detect credit card fraud, or drive autonomous robots). This course provides an in-depth understanding of machine learning and statistical pattern recognition techniques and their applications in biomedical informatics, computer vision, and other domains.
Topics: probability distributions, regression, classification, kernel methods, clustering, semi-supervised learning, mixture models, graphical models, dimensionality reduction, manifold learning, sparse learning, multi-task learning, transfer learning, and hidden Markov models.
Homework assignments include both theoretical derivations and hands-on experiments with various learning algorithms. Every student is required to complete a project, either assigned by the instructor or designed by the student.
Announcements will be emailed to the course mailing list. A welcome note will be sent to the list at the beginning of the semester; if you do not receive it before the first class, please email me.
The grade will be calculated as follows:
- Assignments: 40%
- Project: 25%
- Exams (midterm: Tuesday, March 15; final): 30%
- Class participation: 5%
Lateness and Extensions
Homework is worth full credit at the beginning of class on the due date (later if an extension has been granted), at most 90% credit for the next 24 hours, at most 50% credit for the following 24 hours, and 25% credit after that. If you need an extension, request it by email to the instructor as soon as the need is known; extensions requested promptly will be granted more liberally. You must turn in all assignments.
Collaboration among Students
The purpose of student collaboration is to facilitate learning, not to circumvent it. Studying the material in groups is strongly encouraged. You may also seek help from other students in understanding the material needed to solve a particular homework problem, provided that no written notes are shared or taken at the time, and that learning is facilitated rather than circumvented. Each student must produce the actual solution alone and should be ready to reproduce it upon request. Any form of help or collaboration must be disclosed in full by all involved on the first page of their assignments. In all cases, you must exercise academic integrity.
- Topic 1. Introduction [Lecture Notes]
- Topic 2. Basics - Probability Theory [Lecture Notes]
- Topic 3. Basics - Linear Algebra [Lecture Notes]
- Topic 4. Basics - Linear Algebra - SVD [Notes on SVD]
- Topic 5. Linear Models for Regression [Lecture Notes Part I] [Lecture Notes Part II] [Lecture Notes Part III]
- Topic 6. Linear Models for Classification [Lecture Notes]
- Topic 7. Support Vector Machines [Lecture Notes]
- Topic 8. Kernel Methods [Lecture Notes]
- Topic 9. Ensemble [Lecture Notes Part I] [Lecture Notes Part II]
- Topic 10. Tree Methods [Lecture Notes]
- Topic 11. Clustering/Mixture Models [Lecture Notes]
- Topic 12. Deep Learning [Lecture Info]
- Topic 13. Dimensionality Reduction [Lecture Notes]
- Topic 14. Graphical Model (Not Covered in Class) [Lecture Notes]
- Topic 15. Sparse Learning [Lecture Notes]
- Topic 16. Matrix Completion and Collaborative Filtering [Lecture Notes]
- Topic 17. Transfer and Multi-Task Learning (Not Covered in Class) [Lecture Notes]
Please carefully read the Lateness and Extensions section for policies.
- [Assignment 1] Due on Thursday, Jan 28.
- [Assignment 2] Due on Thursday, Feb 16.
- [Assignment 3] Due on Tuesday, March 15.
- [Assignment 4] Due on Tuesday, April 5.
- [Assignment 5] Due on Tuesday, April 19.
- [Assignment 6] Due on Tuesday, May 3 (Bonus, optional).
The schedule is tentative and subject to change.
- Textbook: Pattern Recognition and Machine Learning, Christopher M. Bishop, 2006. [Webpage]
- Reference book: The Elements of Statistical Learning: Data Mining, Inference, and Prediction (Second Edition), Trevor Hastie, Robert Tibshirani, and Jerome Friedman, 2009. [Book]
- Gradient computation w.r.t. a vector/matrix
https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf
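As a taste of the identities the cookbook collects, here are three standard results that recur throughout the course (reproduced for convenience; the last one underlies the least-squares derivations in the regression lectures):

$$ \nabla_x\,(a^\top x) = a, \qquad \nabla_x\,(x^\top A x) = (A + A^\top)\,x, \qquad \nabla_w\,\tfrac{1}{2}\lVert Xw - y\rVert_2^2 = X^\top (Xw - y) $$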
- Shorter materials
http://ai.stanford.edu/~paskin/gm-short-course/lec1.pdf
http://www.sci.utah.edu/~gerig/CS6640-F2010/prob-tut.pdf
- Longer books
http://mplab.ucsd.edu/tutorials/ProbabilityAndStats.pdf
https://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/amsbook.mac.pdf
- Lecture notes from Andrew Ng:
http://cs229.stanford.edu/notes/cs229-notes1.pdf
- If you are interested in systematically studying optimization, try reading the book Convex Optimization:
https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf
Basic gradient descent is described on page 463 (page 477 of the PDF file).
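For a concrete feel for the update rule, here is a minimal sketch in Python/NumPy. The least-squares objective, the fixed step size chosen from the gradient's Lipschitz constant, and the stopping rule are illustrative assumptions of mine, not the backtracking line-search variant described in the book.

```python
import numpy as np

def gradient_descent(X, y, step, tol=1e-8, max_iter=10_000):
    """Minimize f(w) = 0.5 * ||Xw - y||^2 by basic gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(max_iter):
        grad = X.T @ (X @ w - y)      # gradient of 0.5 * ||Xw - y||^2
        if np.linalg.norm(grad) < tol:
            break                     # gradient is (near) zero: done
        w -= step * grad              # basic update: w <- w - t * grad f(w)
    return w

# Usage: recover a planted weight vector from noiseless synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
L = np.linalg.norm(X, ord=2) ** 2    # Lipschitz constant of the gradient
print(gradient_descent(X, y, step=1.0 / L))   # approx. [1.0, -2.0, 0.5]
```

With a fixed step t = 1/L (L being the largest eigenvalue of XᵀX), convergence is guaranteed for this quadratic objective; the backtracking line search in the book removes the need to know L in advance.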