Title :
Guaranteed two-pass convergence for supervised and inferential learning
Author :
Healy, Michael J. ; Caudell, Thomas P.
Author_Institution :
Dept. of Electr. & Comput. Eng., New Mexico Univ., Albuquerque, NM, USA
Date :
1/1/1998 12:00:00 AM
Abstract :
We present a theoretical analysis of a version of the LAPART adaptive inferencing neural network. Our main result is a proof that the new architecture, called LAPART 2, converges in two passes through a fixed training set of inputs. We also prove that it does not suffer from template proliferation. For comparison, Georgiopoulos et al. (1994) proved an upper bound of n-1 on the number of passes required for convergence of the ARTMAP architecture, where n is the size of the binary pattern input space. If the ARTMAP result is regarded as an n-pass, or finite-pass, convergence result, ours is then a two-pass, or fixed-pass, convergence result. Our results have added significance in that they apply to set-valued mappings, as opposed to the usual supervised learning model of affixing labels to classes.
Keywords :
ART neural nets; convergence; inference mechanisms; learning (artificial intelligence); neural net architecture; pattern classification; LAPART 2; LAPART adaptive inferencing neural network; binary pattern input space; fixed-pass convergence; inferential learning; set-valued mappings; supervised learning; two-pass convergence; Adaptive systems; Convergence; Data mining; Data preprocessing; Logic testing; Neural networks; Resonance; Subspace constraints; Supervised learning; Upper bound
Journal_Title :
Neural Networks, IEEE Transactions on