DocumentCode :
1749199
Title :
A hybrid learning algorithm for multilayer perceptrons to improve generalization under sparse training data conditions
Author :
Tonomura, Masanobu ; Nakayama, Kenji
Author_Institution :
Graduate Sch. of Nat. Sci. & Technol., Kanazawa Univ., Japan
Volume :
2
fYear :
2001
fDate :
2001
Firstpage :
967
Abstract :
The backpropagation (BP) algorithm is the method most commonly used to train multilayer perceptrons. However, it has difficulty achieving high generalization when the amount of training data is limited, i.e. under sparse training data conditions. In this paper, a new learning algorithm is proposed. It combines the BP algorithm with a scheme that modifies the hyperplanes by taking internal information into account; that is, the hyperplanes are controlled by their distance from the critical training data, which lie close to the class boundary. This algorithm works well with sparse training data, achieving high generalization. In order to evaluate generalization, it is assumed that all data are normally distributed around the training data. Several pattern-classification simulations demonstrate the efficiency of the proposed algorithm.
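The hyperplane-adjustment idea described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact update rule: the function name, the cross-entropy error signal, the margin heuristic, and all parameters (`hidden`, `margin_weight`, etc.) are assumptions made for the sketch. After each BP step, the output hyperplane is nudged away from the "critical" training sample whose hidden representation lies closest to it.

```python
import numpy as np

def train_sparse_mlp(X, y, hidden=4, epochs=3000, lr=0.5, margin_weight=0.05, seed=0):
    """BP training plus a heuristic hyperplane adjustment (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    n = len(X)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)            # hidden-layer activations
        out = sigmoid(h @ w2 + b2)          # network output
        delta = out - y                     # output error signal
        # standard backpropagation gradients (full batch)
        gw2 = h.T @ delta / n
        gb2 = delta.mean()
        dh = np.outer(delta, w2) * h * (1.0 - h)
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        # critical sample: the one closest to the output hyperplane
        dist = np.abs(h @ w2 + b2) / (np.linalg.norm(w2) + 1e-12)
        k = int(np.argmin(dist))
        push = 1.0 if y[k] == 1 else -1.0   # push the boundary away from sample k
        w2 -= lr * gw2 - margin_weight * lr * push * h[k]
        b2 -= lr * gb2 - margin_weight * lr * push
        W1 -= lr * gW1
        b1 -= lr * gb1

    def predict(Xq):
        return (sigmoid(sigmoid(Xq @ W1 + b1) @ w2 + b2) > 0.5).astype(int)
    return predict

# Sparse-data usage: only four training samples (logical AND)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
predict = train_sparse_mlp(X, y)
```

The margin term plays the role the abstract assigns to the distance-based hyperplane control: with few samples, plain BP may leave the boundary arbitrarily close to a training point, while the push term widens the margin around the critical data.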
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; pattern classification; generalization; hybrid learning algorithm; sparse training data; Convergence; Covariance matrix; Distributed computing; Eigenvalues and eigenfunctions; Kernel; Learning systems; Support vector machines; Training data
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939491
Filename :
939491