Week / Module | Lecture Notes | Supplementals | Assignments
- Week 1 / Module 0: Intro, Decision Trees
  - Topics:
    - Course Overview
    - Introduction to Machine Learning
    - Data and its types
    - Decision Trees for Classification and Regression
    - Evaluating ML Algorithms
  - Background:
    - Programming: MATLAB, Java, Python, R
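As an illustration of the Week 1 decision-tree material, here is a minimal NumPy sketch of entropy-based split scoring (not from the course materials; function names are my own):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, used as a split-impurity measure."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(parent, left, right):
    """Reduction in entropy from splitting `parent` into `left` and `right`."""
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted
```

A pure split of a balanced binary sample, e.g. `information_gain(np.array([0, 0, 1, 1]), np.array([0, 0]), np.array([1, 1]))`, yields the maximum gain of 1 bit.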
- Week 2 / Module 1: Linear Regression
  - Topics:
    - Linear Regression
    - Background: Gradient Descent
    - Ordinary Least Squares
    - Normal Equations
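The Week 2 topics — ordinary least squares solved both in closed form (normal equations) and by gradient descent — can be sketched in a few lines of NumPy (an illustrative sketch, not the course's reference code):

```python
import numpy as np

def ols_normal_equations(X, y):
    """Closed-form OLS via the normal equations: solve (X^T X) w = X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ols_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Minimise the mean squared error with batch gradient descent."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(n_iters):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the MSE at w
        w -= lr * grad
    return w
```

On data generated by y = 1 + 2x (with a bias column in X), both routines recover the weights [1, 2].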
- Week 3 / Module 1: Linear Regression
  - Topics:
    - Background: Probability Distributions
    - Linear Regression: Maximum Likelihood
    - Linear Regression: Maximum A Posteriori
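For the Week 3 probabilistic view: the MAP estimate under a Gaussian likelihood and a zero-mean Gaussian prior on the weights reduces to L2-regularised (ridge) least squares. A minimal sketch, assuming a single hyperparameter `alpha` absorbing the prior-precision-to-noise ratio (my naming, not the course's):

```python
import numpy as np

def map_linear_regression(X, y, alpha=1.0):
    """MAP estimate with a Gaussian prior on w, i.e. ridge regression:
    w = (X^T X + alpha * I)^{-1} X^T y. alpha -> 0 recovers the
    maximum-likelihood (OLS) solution."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)
```

With `alpha = 0` this matches OLS exactly; larger `alpha` shrinks the weights toward the prior mean of zero.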
- Week 4 / Module 2: Classification
  - Topics:
    - Classification: A Probabilistic Perspective
    - Gaussian Discriminant Analysis
    - Naive Bayes
    - Logistic Regression
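Of the Week 4 classifiers, logistic regression is the easiest to sketch end to end: gradient descent on the cross-entropy loss (an illustrative NumPy sketch under my own naming, not the course's reference implementation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, n_iters=2000):
    """Logistic regression trained by gradient descent on the
    negative log-likelihood (mean cross-entropy). y is in {0, 1}."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)  # gradient of the mean cross-entropy
    return w

def predict(X, w):
    """Threshold the predicted probability at 0.5."""
    return (sigmoid(X @ w) >= 0.5).astype(int)
```

On linearly separable data (with a bias column in X) the fitted model classifies the training set perfectly.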
- Week 5 / Module 2: Classification
  - Topics:
    - Perceptrons
    - Neural Networks
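The perceptron from Week 5 has a famously small update rule — on a mistake, add the misclassified example to the weights. A minimal sketch (not from the course materials):

```python
import numpy as np

def perceptron(X, y, n_epochs=10):
    """Rosenblatt perceptron: on each mistake, add y_i * x_i to the weights.
    Labels are expected in {-1, +1}; converges if the data are separable."""
    w = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:  # misclassified (or on the boundary)
                w += yi * xi
                errors += 1
        if errors == 0:  # converged: a full pass with no mistakes
            break
    return w
```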
- Week 6 / Module 3: Support Vector Machines
  - Topics:
    - Large Margin Classifiers, Hinge Loss
    - Background: Convex Optimization
    - Background: Lagrange Multipliers, Duality
    - Primal SVM
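The primal soft-margin SVM objective from Week 6 combines an L2 penalty on the weights with the hinge loss over the training margins. A sketch of just the objective evaluation (illustrative, with a standard `C` trade-off parameter):

```python
import numpy as np

def primal_svm_objective(w, X, y, C=1.0):
    """Primal soft-margin SVM objective:
    0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (w @ x_i)).
    Labels y are in {-1, +1}."""
    margins = y * (X @ w)
    return 0.5 * (w @ w) + C * np.maximum(0.0, 1.0 - margins).sum()
```

Points with margin at least 1 contribute no hinge loss; points inside the margin (or misclassified) contribute linearly.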
- Week 7 / Module 3: Support Vector Machines
  - Topics:
    - Dual SVM
    - L1 and L2 SVM
    - Optimizing the Dual SVM
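The Week 7 dual formulation optimises over one multiplier per training point rather than over the weights; the objective depends on the data only through inner products (the Gram matrix K). A sketch of evaluating that dual objective (illustrative only; the constrained maximisation itself is not shown):

```python
import numpy as np

def svm_dual_objective(alpha, K, y):
    """Dual SVM objective:
    sum_i alpha_i - 0.5 * sum_{i,j} alpha_i alpha_j y_i y_j K_ij,
    maximised subject to 0 <= alpha_i (<= C for L1-SVM) and sum_i alpha_i y_i = 0."""
    Ky = (y[:, None] * y[None, :]) * K  # elementwise y_i y_j K_ij
    return alpha.sum() - 0.5 * alpha @ Ky @ alpha
```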
- Week 8 / Module 4: Kernel Methods
  - Topics:
    - Kernels and Inner Products
    - Kernelizing Learning Algorithms
    - Representer Theorem
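A kernel, as in Week 8, is an inner product computed implicitly in a feature space. The Gaussian (RBF) kernel is the standard example; a minimal NumPy sketch:

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K_ij = exp(-gamma * ||x_i - x_j||^2),
    a valid inner product in an infinite-dimensional feature space."""
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)
```

The resulting Gram matrix is symmetric with unit diagonal, as any normalised kernel must be.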
- Week 12 / Module 5: Clustering
  - Topics:
    - K-Means Clustering
    - Gaussian Mixture Models
    - EM Algorithm
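K-means from the clustering module alternates two steps: assign each point to its nearest centroid, then move each centroid to its cluster mean (Lloyd's algorithm). A minimal sketch, assuming centroids are initialised from random data points:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and
    recomputing each centroid as its cluster mean."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):  # converged
            break
        centroids = new
    return centroids, labels
```

The EM algorithm for Gaussian mixtures generalises this: hard assignments become posterior responsibilities, and means are updated as responsibility-weighted averages.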
- Week 9 / Module 6: Ensemble Methods
  - Topics:
    - Bagging Classifiers
    - Class Imbalance and Bagging
    - Random Forests
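Bagging, from Week 9, trains each base learner on a bootstrap replicate of the data and aggregates by majority vote. A sketch, where `base_fit_predict` is a hypothetical callable standing in for any base learner:

```python
import numpy as np

def bootstrap_sample(X, y, rng):
    """Draw n examples with replacement (one bootstrap replicate)."""
    idx = rng.integers(0, len(X), size=len(X))
    return X[idx], y[idx]

def bagged_predict(X_train, y_train, X_test, base_fit_predict,
                   n_estimators=25, seed=0):
    """Bagging: train each base learner on a bootstrap replicate, then
    predict by majority vote over {-1, +1} labels. `base_fit_predict`
    is a placeholder: it fits on (Xb, yb) and returns predictions for X_test."""
    rng = np.random.default_rng(seed)
    votes = np.zeros(len(X_test))
    for _ in range(n_estimators):
        Xb, yb = bootstrap_sample(X_train, y_train, rng)
        votes += base_fit_predict(Xb, yb, X_test)
    return np.sign(votes)
```

A random forest adds one ingredient on top of this: each tree also samples a random subset of features at every split.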
- Week 10 / Module 6: Ensemble Methods
  - Topics:
    - Boosting: AdaBoost
    - Boosting: Gradient Boosting
    - Error Correcting Output Codes
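The heart of AdaBoost from Week 10 is its per-round reweighting: compute the weak learner's weighted error, derive its vote, and upweight the examples it got wrong. A sketch of one round (illustrative; assumes labels in {-1, +1} and weighted error strictly between 0 and 1):

```python
import numpy as np

def adaboost_round(weights, y_true, y_pred):
    """One AdaBoost round: weighted error of the weak learner, its vote
    alpha = 0.5 * ln((1 - err) / err), and the reweighted, renormalised
    example distribution."""
    err = weights[y_pred != y_true].sum()
    alpha = 0.5 * np.log((1.0 - err) / err)
    new_w = weights * np.exp(-alpha * y_true * y_pred)  # upweight mistakes
    return alpha, new_w / new_w.sum()
```

A classic sanity check: after reweighting, the misclassified examples carry exactly half the total weight, so the same weak learner would do no better than chance on the new distribution.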
- Week 11 / Module 7: Feature Engineering
  - Topics:
    - Feature Selection: Wrapper Methods
    - Feature Selection: Filter Methods
    - Fisher Discriminant Analysis
    - Principal Component Analysis
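PCA from Week 11 can be computed via the SVD of the centred data matrix: the right singular vectors are the principal directions. A minimal sketch (not the course's reference code):

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the centred data: returns the top principal
    directions (rows) and the data projected onto them."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    return components, Xc @ components.T
```

On data lying exactly on a line, a single component reconstructs the centred data perfectly.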