ECE 299: Statistical Learning Theory (Spring 2011)
Announcements
- Office hours are 10 am to noon on Wednesdays in CIEMAS 2471. There will be no office hours on February 9.
- Homework 1 is out.
- Prof. Jake Bouvrie will give the lectures on February 7 and February 9.
- Homework 2 is out.
- Homework 3 is out.
- There will be no class on April 6.
- Homework 4 is out.
About this class
Statistical learning theory is a burgeoning research field at the intersection of probability, statistics, computer science, and optimization that studies the performance of computer algorithms for making predictions from training data.
The following topics will be covered: basics of statistical decision theory; concentration inequalities; supervised and unsupervised learning; empirical risk minimization; complexity-regularized estimation; generalization bounds for learning algorithms; VC dimension and Rademacher complexities; minimax lower bounds; online learning and optimization. Along with the general theory, we will discuss a number of applications of statistical learning theory to signal processing, information theory, and adaptive control.
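To make one of the listed topics concrete, here is a minimal sketch of empirical risk minimization over a finite hypothesis class: we choose, among threshold classifiers of the form h_t(x) = 1{x ≥ t}, the one minimizing the empirical 0-1 risk on a training sample. The data-generating process, noise rate, and threshold grid below are illustrative assumptions, not course material.

```python
# Empirical risk minimization (ERM) over threshold classifiers,
# a minimal illustrative sketch (all parameters are assumptions).
import random

def empirical_risk(threshold, sample):
    """Fraction of training points misclassified by h(x) = 1{x >= threshold}."""
    return sum(1 for x, y in sample if (x >= threshold) != y) / len(sample)

def erm(sample, thresholds):
    """Return the threshold in the class with smallest empirical risk."""
    return min(thresholds, key=lambda t: empirical_risk(t, sample))

random.seed(0)
# Synthetic sample: labels follow a true threshold at 0.5, flipped with
# probability 0.1 (label noise).
sample = []
for _ in range(200):
    x = random.random()
    y = (x >= 0.5) != (random.random() < 0.1)
    sample.append((x, y))

# Minimize empirical risk over a grid of candidate thresholds.
t_hat = erm(sample, [i / 100 for i in range(101)])
```

With enough data, the ERM threshold concentrates near the true value 0.5, and the generalization bounds covered in the course quantify how fast; the noise rate lower-bounds the achievable risk.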
Basic prerequisites include probability theory, calculus, and linear algebra. Other necessary material and background will be introduced as needed.
Other statistical learning theory classes
The rough outline of the course is fairly standard, although the precise selection of topics will reflect my own interests and expertise. Here is a sampling of similar courses at other institutions: