This course will give a detailed introduction to learning theory with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:
* probabilistic inequalities and concentration inequalities
* union bounds, chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions
Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
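As a small illustration of the first two themes, the following sketch combines Hoeffding's concentration inequality with a union bound to get a uniform generalization guarantee over a finite hypothesis class. This is a standard textbook computation, not part of the course materials; the function names are my own.

```python
import math

def hoeffding_gap(n, delta):
    """Hoeffding's inequality: for one fixed classifier evaluated on n i.i.d.
    samples, with probability at least 1 - delta the gap between empirical
    and true risk is at most sqrt(log(2/delta) / (2n))."""
    return math.sqrt(math.log(2 / delta) / (2 * n))

def finite_class_gap(n, delta, m):
    """Union bound over a finite class of m classifiers: the same guarantee
    holds simultaneously for all of them, at the cost of a log(m) term."""
    return math.sqrt((math.log(m) + math.log(2 / delta)) / (2 * n))
```

Note that the uniform bound only grows logarithmically with the class size m; for infinite classes this role is played by the VC dimension or Rademacher averages covered later in the course.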
T. Banica, S. Curran, and R. Speicher. (2009). arXiv:0907.3314. Published in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/10-AOP619.
R. Speicher. (2009). arXiv:0911.0087. 21 pages; contribution to the Handbook on Random Matrix Theory, to be published by Oxford University Press.
J. Helton, T. Mai, and R. Speicher. (2015). arXiv:1511.05330. Major revision undertaken for the sake of clarity and readability.