This course will give a detailed introduction to learning theory with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:
* probabilistic inequalities and concentration inequalities
* union bounds, chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions
Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
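As a simple illustration of the kind of probabilistic bound the course develops (not taken from the course itself), combining Hoeffding's inequality with a union bound over a finite class $\mathcal{F}$ of classifiers yields a standard generalization guarantee:

```latex
% Let R(f) be the true risk, \hat{R}_n(f) the empirical risk on an
% i.i.d. sample of size n, and \mathcal{F} a finite function class.
% Hoeffding: for each fixed f,
%   P(|R(f) - \hat{R}_n(f)| > \varepsilon) \le 2 e^{-2 n \varepsilon^2}.
% A union bound over the |\mathcal{F}| functions then gives,
% with probability at least 1 - \delta,
\sup_{f \in \mathcal{F}} \left| R(f) - \hat{R}_n(f) \right|
  \le \sqrt{\frac{\ln |\mathcal{F}| + \ln (2/\delta)}{2n}}.
```

Replacing $\ln|\mathcal{F}|$ with capacity measures such as the VC dimension or Rademacher averages extends this idea to infinite classes.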
The EDRL research group is organized around a theoretical strand (embodied cognition), a methodological line (design-based research), and a disciplinary emphasis (mathematics). The laboratory thus hosts the full cycle of design-research projects, which are geared to contribute to the theory and practice of multimodal mathematical learning and reasoning, as well as to design theory.
K. Kawaguchi, L. Kaelbling, and Y. Bengio. (2017). arXiv:1710.05468. Comment: To appear in Mathematics of Deep Learning, Cambridge University Press. All previous results remain unchanged.
M. Cerulli, A. Chioccariello, and E. Lemut. Proceedings of the Fifth Congress of the European Society for Research in Mathematics Education (CERME 5). Larnaca, Cyprus, (2007)
M. Cerulli, A. Chioccariello, and E. Lemut. Proceedings of the Fourth Congress of the European Society for Research in Mathematics Education (CERME 4), pages 591-600. Sant Feliu de Guíxols, Spain, (2005)