Title :
Convergence of a neural network classifier
Author :
Baras, John S. ; La Vigna, Anthony
Author_Institution :
Dept. of Electr. Eng., Maryland Univ., College Park, MD, USA
Abstract :
It is shown that the LVQ (learning vector quantization) learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. The learning algorithm is shown to perform stochastic approximation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. The authors also present a modification to the learning algorithm which, they argue, yields convergence of the LVQ for a larger set of initial conditions. Finally, it is shown that the LVQ is a general histogram classifier and that its risk converges to the Bayesian optimal risk as the appropriate parameters go to infinity with the number of past observations.
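The abstract concerns the LVQ update rule, which adjusts a labeled codebook of Voronoi vectors toward or away from each training sample. As an illustration only (not the paper's own code, and the function name, learning rate, and toy data are assumptions), a single LVQ1-style update step can be sketched as:

```python
import numpy as np

def lvq1_step(codebook, labels, x, y, lr):
    """One LVQ1 update: move the nearest Voronoi vector toward the
    sample x if their class labels agree, away from it otherwise."""
    c = np.argmin(np.linalg.norm(codebook - x, axis=1))  # nearest vector
    sign = 1.0 if labels[c] == y else -1.0
    codebook[c] += sign * lr * (x - codebook[c])
    return codebook

# Toy example: two Voronoi vectors with class labels 0 and 1.
cb = np.array([[0.0, 0.0], [1.0, 1.0]])
lbl = np.array([0, 1])
# Sample of class 0 near the first vector: labels agree, so the
# nearest vector is pulled toward the sample.
cb = lvq1_step(cb, lbl, np.array([0.2, 0.0]), 0, 0.5)
```

In the stochastic-approximation view taken by the paper, the learning rate `lr` plays the role of a decreasing gain sequence, and the iterates track the trajectories of the associated ordinary differential equation.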
Keywords :
convergence; learning systems; neural nets; pattern recognition; Bayesian optimal risk; Voronoi vectors; learning algorithm; learning vector quantization; locally asymptotically stable equilibria; neural network classifier; ordinary differential equation; stochastic approximation; Approximation algorithms; Bayesian methods; Differential equations; H infinity control; Histograms; Neural networks; Statistics; Stochastic processes; Vector quantization;
Conference_Titel :
Proceedings of the 29th IEEE Conference on Decision and Control, 1990
Conference_Location :
Honolulu, HI
DOI :
10.1109/CDC.1990.203918