HIGHLIGHTS
A real Caltech course, not a watered-down version
- Free, introductory Machine Learning course
- Taught by Caltech Professor Yaser Abu-Mostafa
- Lectures recorded from a live broadcast, including Q&A
- Prerequisites: Basic probability, matrices, and calculus
- Homework assignments with online grading and ranking
- Discussion forum for participants
This is an introductory course on machine learning that covers the basic theory, algorithms, and applications. Machine learning enables computational systems to adaptively improve their performance with experience accumulated from the observed data. It has become one of the hottest fields of study today, with applications in engineering, science, finance, and commerce. The course balances theory and practice, and covers the mathematical as well as the heuristic aspects. The lectures follow each other in a story-like fashion, with the main topics listed below.
The lectures are about 60 minutes each plus Q&A. You can also look for a particular topic in the Machine Learning Video Library.
- Lecture 1: The Learning Problem
- Lecture 2: Is Learning Feasible?
- Lecture 3: The Linear Model I
- Lecture 4: Error and Noise
- Lecture 5: Training versus Testing
- Lecture 6: Theory of Generalization
- Lecture 7: The VC Dimension
- Lecture 8: Bias-Variance Tradeoff
- Lecture 9: The Linear Model II
- Lecture 10: Neural Networks
- Lecture 11: Overfitting
- Lecture 12: Regularization
- Lecture 13: Validation
- Lecture 14: Support Vector Machines
- Lecture 15: Kernel Methods
- Lecture 16: Radial Basis Functions
- Lecture 17: Three Learning Principles
- Lecture 18: Epilogue
The story line from Lecture 1 to Lecture 18 is:
- What is learning?
- Can we learn?
- How to do it?
- How to do it well?
- Take-home lessons.
This course was broadcast live from the lecture hall at Caltech in April and May 2012. The lectures included live Q&A sessions with online audience participation. Here is a sample of a live lecture as the online audience saw it in real time.