Statistical Learning Theory
This page contains lecture notes for ECE 543, Statistical Learning Theory:
- PDF of the latest revision
Revision log:
- Mar 18, 2021
  - fixed minor typos in Chapter 8
- Mar 11, 2021
  - added a key monotonicity condition to Theorem 8.4
- Feb 9, 2021
  - fixed minor typos in Chapters 2 and 5
- Jan 28, 2021
  - added a discussion of interpolation without sacrificing statistical optimality (Section 1.3)
- May 16, 2019
  - too many changes to catalog; lots of bug fixes and new content
- Apr 4, 2018
  - added a section on the analysis of stochastic gradient descent (Section 11.6)
  - added a new chapter on online optimization algorithms (Chapter 12)
- Mar 28, 2018
  - revised and extended the section on convex analysis in Hilbert spaces (Section 11.1)
  - revised the section on stochastic gradient descent (Section 11.5)
- Mar 27, 2018
  - added Rademacher complexity bounds for neural nets (Section 6.5)
- Feb 28, 2018
  - revised the basic bounds via surrogate losses (Section 6.2)
  - added a section on AdaBoost (Section 6.4)
- Feb 19, 2018
  - revised the section on structural results for Rademacher averages (Chapter 4)
  - added the statement and proof of the contraction principle (Chapter 4)
  - added a proof of the Sauer-Shelah lemma via Pajor's theorem (Chapter 5)
- Feb 7, 2018
  - added learnability of finite concept classes in the model-free framework
  - streamlined the symmetrization argument in Chapter 4
  - streamlined the proof of the Finite Class Lemma in Chapter 4
- Jan 25, 2018
  - added PAC learnability of finite concept classes
- Jan 18, 2018
  - streamlined the proof of McDiarmid's inequality
Note to instructors: You are welcome to use any portion of these lecture notes in your own classes without asking our permission, but please give us proper credit and include a link to this web page (http://maxim.ece.illinois.edu/teaching/SLT), where the most recent revision can be found.
Feedback is always welcome, especially bug reports. We particularly appreciate hearing from students or instructors outside UIUC who find this stuff useful (or useless).