This course will give a detailed introduction to learning theory with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be: probabilistic inequalities and concentration inequalities; union bounds and chaining; measuring the size of a function class via the Vapnik-Chervonenkis dimension, the shattering dimension, and Rademacher averages; and classification with real-valued functions. Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
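As a small illustration of the first two themes, a concentration inequality combined with a union bound yields the classical generalization guarantee for a finite hypothesis class: with probability at least 1 - delta, every hypothesis h satisfies true_error(h) <= empirical_error(h) + sqrt(log(|H|/delta) / (2n)). The sketch below just evaluates this bound; the function name and parameters are illustrative, not taken from the course.

```python
import math

def hoeffding_union_bound(n, num_hypotheses, delta):
    """Uniform deviation bound for a finite class H of hypotheses with
    losses in [0, 1]: Hoeffding's inequality gives a tail bound for one
    fixed hypothesis, and a union bound over the |H| hypotheses makes it
    hold simultaneously for all of them with probability >= 1 - delta."""
    return math.sqrt(math.log(num_hypotheses / delta) / (2 * n))

# Example: 1000 samples, 100 hypotheses, confidence 95%.
eps = hoeffding_union_bound(n=1000, num_hypotheses=100, delta=0.05)
print(f"generalization gap <= {eps:.4f}")
```

Note how the class size enters only logarithmically: doubling |H| widens the bound by sqrt(log 2 / (2n)) at most, which is the starting point for the more refined capacity measures (VC dimension, Rademacher averages) covered later.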
J. Helton, T. Mai, and R. Speicher (2015). arXiv:1511.05330. Comment: We have undertaken a major revision, mainly for the sake of clarity and readability.
A. Gorban and I. Tyukin (2018). arXiv:1801.03421. Comment: Accepted for publication in Philosophical Transactions of the Royal Society A, 2018. Comprises 17 pages and 4 figures.