ECE 299: Statistical Learning Theory (Spring 2011)

Maxim Raginsky (m.raginsky@duke.edu)
MW 1:15-2:30, 1441 CIEMAS

About  |  Schedule  |  References  |  Coursework


Announcements


About this class

Statistical learning theory is a burgeoning research field at the intersection of probability, statistics, computer science, and optimization. It studies how well computer algorithms can make predictions on the basis of training data.

The following topics will be covered: basics of statistical decision theory; concentration inequalities; supervised and unsupervised learning; empirical risk minimization; complexity-regularized estimation; generalization bounds for learning algorithms; VC dimension and Rademacher complexities; minimax lower bounds; online learning and optimization. Along with the general theory, we will discuss a number of applications of statistical learning theory to signal processing, information theory, and adaptive control.
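To give a flavor of one central topic, empirical risk minimization, here is a minimal sketch in Python: it picks, from a class of threshold classifiers, the one with the smallest error on a synthetic training set. All data, the noise level, and the hypothesis class are invented purely for illustration and are not drawn from the course materials.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set (hypothetical): labels given by a threshold at 0.6,
# with 10% of the labels flipped at random to simulate noise.
n = 200
x = rng.uniform(0.0, 1.0, size=n)
y = (x > 0.6).astype(int)
flip = rng.random(n) < 0.1
y[flip] = 1 - y[flip]

# Hypothesis class: threshold classifiers h_t(x) = 1{x > t}.
thresholds = np.linspace(0.0, 1.0, 101)

def empirical_risk(t):
    """Fraction of training points misclassified by h_t."""
    return np.mean((x > t).astype(int) != y)

# Empirical risk minimization: choose the hypothesis with the
# smallest training error.
t_hat = min(thresholds, key=empirical_risk)
```

The learned threshold t_hat should land near the true value 0.6, with training error close to the 10% noise rate; much of the course's theory (generalization bounds, VC dimension) concerns when and why such a minimizer also predicts well on unseen data.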

Basic prerequisites include probability theory, calculus, and linear algebra. Other necessary material and background will be introduced as needed.


Other statistical learning theory classes

The rough outline of the course is fairly standard, although the precise selection of topics reflects my own interests and expertise. Here is a sampling of similar courses at other institutions: