Title :
Prediction error and consistent parameter area in neural learning
Author :
Ikeda, Kazushi ; Amari, Shun-Ichi ; Yoshizawa, Shuji
Author_Institution :
Fac. of Eng., Tokyo Univ., Japan
Abstract :
As the number of training examples increases, a learning machine's behavior improves. An important problem is to know how fast and how well this improvement proceeds, and the average prediction error is one of the most popular criteria for evaluating it. From a geometrical point of view, each training example delimits a region in the parameter space that must contain the true parameter. The set of parameters whose machines all explain the input-output relation of the given examples is called the consistent area. It is dual to the convex hull of the examples in the input signal space. We have studied the stochastic geometrical features of this convex hull and derived upper and lower bounds on the average prediction error of the simple perceptron network.
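The geometric picture in the abstract can be illustrated with a minimal sketch (not the authors' method, and all names here are hypothetical): for a simple perceptron with labels y_i = sign(w . x_i), each example cuts a half-space through the origin of parameter space, and the consistent area is the intersection of these half-spaces. A Monte Carlo estimate of its solid-angle fraction shows how the region shrinks as examples accumulate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a true weight vector labels random inputs.
d, n = 3, 20                          # parameter dimension, number of examples
w_true = rng.standard_normal(d)
X = rng.standard_normal((n, d))       # example inputs
y = np.sign(X @ w_true)               # perceptron labels in {-1, +1}

def consistent(w, X, y):
    """w lies in the consistent area iff it reproduces every label,
    i.e. y_i * (w . x_i) > 0 for all i (each example is a half-space)."""
    return bool(np.all(y * (X @ w) > 0))

# Monte Carlo estimate of the consistent area's fraction of directions:
# sample random parameter vectors and count how many explain all examples.
samples = rng.standard_normal((50000, d))
hits = np.all(y * (samples @ X.T) > 0, axis=1)
frac = hits.mean()
```

The true parameter is consistent by construction, and `frac` gives a rough measure of how tightly the examples have pinned down the parameter; adding rows to `X` can only shrink the region.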
Keywords :
computational geometry; error analysis; learning (artificial intelligence); perceptrons; prediction theory; consistent parameter area; convex hull; lower bounds; neural learning; parameter space; perceptron network; prediction error; stochastic geometrical features; upper bounds; Bayesian methods; Machine learning; Probability distribution; Signal generators; Statistical distributions; Stochastic processes; Testing;
Conference_Titel :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
DOI :
10.1109/IJCNN.1993.716963