Title :
New bounds for correct generalization
Author :
Mattera, Davide ; Palmieri, Francesco
Author_Institution :
Department of Electronic Engineering, University of Naples, Italy
Abstract :
A theoretical approach to determining the number of training examples required by a neural network architecture is provided by the Vapnik-Chervonenkis theory of empirical risk minimization. We report a new bound on the joint probability that the approximation error between the binary function learned from the input/output examples and the target binary function exceeds ε while the empirical error on the examples remains below a fixed non-null fraction of ε. The given bounds are independent of the probability distribution on the input space and improve some existing results on the generalization ability of an adaptive binary function.
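For orientation only, the following is the classical distribution-free Vapnik-Chervonenkis bound that results of this kind refine; it is standard background, not the new bound announced in the abstract, and the symbols H, d, n, R, and R_emp are introduced here as assumptions rather than taken from the paper. Writing H for the class of binary functions realized by the architecture, d for its VC dimension, R(h) for the true error, and R_emp(h) for the empirical error on n i.i.d. examples:

    % Classical VC bound (background sketch; not the paper's new result).
    % The growth function \Pi_H is controlled by the Sauer-Shelah lemma.
    \[
      \Pr\Big[\sup_{h \in H}\big|R(h) - R_{\mathrm{emp}}(h)\big| > \varepsilon\Big]
      \;\le\; 4\,\Pi_H(2n)\,e^{-n\varepsilon^2/8},
      \qquad
      \Pi_H(m) \;\le\; \Big(\frac{em}{d}\Big)^{d} \quad (m \ge d).
    \]

The joint event considered in the abstract, R(h) > ε together with R_emp(h) ≤ λε for a fixed λ > 0, is the relative-accuracy regime in which such exponents can typically be sharpened.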
Keywords :
error statistics; generalisation (artificial intelligence); learning (artificial intelligence); minimisation; neural net architecture; probability; Vapnik-Chervonenkis theory; approximation error; binary function; bounds; empirical risk; error probability; generalization; minimization; neural network architecture; Error correction; Error probability; Frequency measurement; Neural networks; Probability distribution; Risk analysis
Conference_Titel :
International Conference on Neural Networks, 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.616173