
Probabilistic Analysis of Learning in Artificial Neural Networks: The PAC Model and its Variants

Neural Computing Surveys, 1997

Abstract

There are a number of mathematical approaches to the study of learning and generalization in artificial neural networks. Here we survey the 'probably approximately correct' (PAC) model of learning and some of its variants. These models provide a probabilistic framework for the discussion of generalization and learning. This survey concentrates on the sample complexity questions in these models; that is, the emphasis is on how many examples should be used for training. Computational complexity considerations are briefly discussed for the basic PAC model. Throughout, the importance of the Vapnik-Chervonenkis dimension is highlighted. Particular attention is devoted to describing how the probabilistic models apply in the context of neural network learning, both for networks with binary-valued output and for networks with real-valued output.

Contents

1 Introduction
2 The Basic PAC Model of Learning
3 VC-Dimension and Growth Function
4 VC-Dimension and Linear Dimension
5 A Useful P...
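The sample complexity question the abstract emphasizes has a classical quantitative answer: the number of training examples sufficient for PAC learning grows roughly as (1/ε)(d·log(1/ε) + log(1/δ)), where d is the VC dimension of the hypothesis class, ε the accuracy parameter, and δ the confidence parameter. The sketch below is illustrative only; the constants follow one common statement of the bound (e.g., Blumer et al.-style results), not a tight or definitive version from this survey.

```python
import math

def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
    """Illustrative sufficient sample size for PAC learning a class of
    VC dimension `vc_dim` to accuracy epsilon with confidence 1 - delta.

    Uses the standard O((1/eps) * (d*log(1/eps) + log(1/delta))) shape;
    the constants (4, 12, 2) are one common choice, shown for illustration.
    """
    return math.ceil(
        (4 / epsilon) * (vc_dim * math.log2(12 / epsilon) + math.log2(2 / delta))
    )

# Example: a linear threshold unit (perceptron) on n real inputs has
# VC dimension n + 1, so for n = 10 inputs we take d = 11.
m = pac_sample_bound(vc_dim=11, epsilon=0.1, delta=0.05)
print(m)  # a few thousand examples suffice at this accuracy/confidence
```

Note how the bound scales linearly with the VC dimension and only logarithmically with 1/δ, which is why the survey highlights the VC dimension as the key combinatorial quantity controlling sample complexity.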

