Title :
Studies of generalization for the LAPART-2 architecture
Author :
Caudell, Thomas P. ; Healy, Michael J.
Author_Institution :
Dept. of Electr. Eng. & Comput. Eng., New Mexico Univ., Albuquerque, NM, USA
Abstract :
This paper presents the results of a computer study of supervised learning and generalization in a new neural architecture that has extremely tight bounds on learning convergence. LAPART-2 is an extended version of LAPART-1, introduced previously by the authors (1998). This paper explores the architecture's generalization through a series of numerical experiments using challenging classification problems. This class of problem is used in this study because of the simplicity of correctness analysis and the availability of theoretical bounds on performance. Three classification problems were chosen for an initial study of generalization in LAPART-2 learning. Bayesian classification performance was calculated for each of the test problems for comparison. These experiments demonstrate that the generalization performance of LAPART-2 closely matches that of the Bayesian classifier on each of the problems, and that learning converges in two or fewer epochs. LAPART-2 has one of the tightest known theoretical bounds on learning convergence together with excellent generalization performance.
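The abstract compares LAPART-2's generalization against Bayesian classification performance. The paper does not specify the test problems here, but as an illustration of the kind of baseline being computed, the sketch below derives the Bayes-optimal decision boundary and error rate for a hypothetical two-class problem with equal-prior, equal-variance 1-D Gaussian class densities (all parameter values are assumptions, not taken from the paper):

```python
import math

def gauss_cdf(x, mu=0.0, sigma=1.0):
    """Gaussian CDF evaluated via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical two-class problem: equal priors, classes N(mu0, sigma^2)
# and N(mu1, sigma^2). These values are illustrative assumptions.
mu0, mu1, sigma = 0.0, 2.0, 1.0

# With equal priors and equal variances, the Bayes-optimal boundary
# is the midpoint between the class means.
boundary = (mu0 + mu1) / 2.0

# Bayes error = the probability mass each class places on the wrong
# side of the boundary, weighted by its prior (0.5 each).
bayes_error = 0.5 * (1.0 - gauss_cdf(boundary, mu0, sigma)) \
            + 0.5 * gauss_cdf(boundary, mu1, sigma)

print(f"boundary = {boundary:.3f}, Bayes error = {bayes_error:.4f}")
```

For these assumed parameters the Bayes error is about 0.159; a learned classifier's test-set error on such a problem can be compared directly against this theoretical floor, which is the style of comparison the abstract describes.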
Keywords :
convergence; generalisation (artificial intelligence); learning (artificial intelligence); neural net architecture; pattern classification; LAPART-2; generalization; neural architecture; performance evaluation; supervised learning; Availability; Bayesian methods; Computer architecture; Convergence; Learning systems; Logic testing; Performance analysis; Supervised learning; Tuning;
Conference_Titel :
IJCNN '99: International Joint Conference on Neural Networks, 1999
Conference_Location :
Washington, DC, USA
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.832687