ECE 598MR: Statistical Learning Theory (Fall 2013)

Maxim Raginsky
TTh 9:30-10:50am, 106B3 Engineering Hall



Announcements


About this class

Statistical learning theory is a burgeoning research field at the intersection of probability, statistics, computer science, and optimization that studies the performance of computer algorithms for making predictions on the basis of training data.

The following topics will be covered: basics of statistical decision theory; concentration inequalities; supervised and unsupervised learning; empirical risk minimization; complexity-regularized estimation; generalization bounds for learning algorithms; VC dimension and Rademacher complexities; minimax lower bounds; online learning and optimization. Along with the general theory, we will discuss a number of applications of statistical learning theory to signal processing, information theory, and adaptive control.

Basic prerequisites include probability theory and random processes, calculus, and linear algebra. Other necessary material and background will be introduced as needed.


Other statistical learning theory classes

The rough outline of the course is fairly standard, although the precise selection of topics will reflect my own interests and expertise. Here is a link to a similar course I taught at Duke University, along with a sampling of similar courses at other institutions: