This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik–Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
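As a taste of the kind of result such a course derives, here is a standard bound obtained by combining Hoeffding's inequality with a union bound over a finite hypothesis class (the notation $\mathcal{H}$, $R$, $\widehat{R}_n$ is illustrative, not taken from the course materials): with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for every $h \in \mathcal{H}$,

$$
R(h) \;\le\; \widehat{R}_n(h) + \sqrt{\frac{\log|\mathcal{H}| + \log(1/\delta)}{2n}},
$$

where $R(h)$ is the true (expected) 0–1 risk and $\widehat{R}_n(h)$ is the empirical risk on the sample. The union bound contributes the $\log|\mathcal{H}|$ term; replacing it with the VC dimension or Rademacher averages extends such bounds to infinite classes, which is precisely the role of the size measures listed above.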
D. Diochnos, S. Mahloujifar, and M. Mahmoody (2018). arXiv:1810.12272. Comment: Full version of a work with the same title that will appear in NIPS 2018; 31 pages, 5 figures, 1 table, 2 algorithms.
S. Yagli, A. Dytso, and H. Poor (2020). arXiv:2005.02503. Comment: Accepted for publication in the Proceedings of the 21st IEEE International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), 2020. The arXiv version uses 10pt font (6 pages); it is otherwise identical to the conference version, which uses 9pt font to meet the strict page-margin requirements.
W. Yan, A. Vangipuram, P. Abbeel, and L. Pinto (2020). arXiv:2003.05436. Comment: Project website: https://sites.google.com/view/contrastive-predictive-model.