DocumentCode :
396688
Title :
Layered neural network training with model switching and hidden layer feature regularization
Author :
Kameyama, Keisuke ; Taga, Kei
Author_Institution :
Tsukuba Adv. Res. Alliance, Tsukuba Univ., Japan
Volume :
3
fYear :
2003
fDate :
20-24 July 2003
Firstpage :
2294
Abstract :
This work introduces a scheme for layered neural network training that incorporates dynamic model alteration during training and regularization of the features extracted by the hidden-layer units. Previously, Model Switching (MS), a simultaneous search scheme for an optimal model and its parameters, was shown to improve training efficiency and, as a side effect, generalization ability. In MS, the operation that switches the network to a different model involves orthogonalization of the features extracted in the hidden layer. Assuming that this orthogonalization contributes to the observed merits, we introduce the joint use of MS and orthogonalization of the hidden-layer features by adding a regularization term to the training objective. A network trained with the proposed scheme is applied to a pattern recognition problem, and improvements in both training efficiency and generalization ability are observed.
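The abstract describes adding a regularization term that pushes the features extracted by different hidden units toward orthogonality. The paper's exact formulation is not given here, but a minimal sketch of one common choice — penalizing the squared off-diagonal entries of the Gram matrix of normalized hidden-unit feature vectors over a batch — could look as follows (the function name and normalization are illustrative assumptions, not the authors' definition):

```python
import numpy as np

def orthogonality_penalty(H):
    """Illustrative regularization term penalizing correlated hidden features.

    H: (n_samples, n_units) array of hidden-unit outputs over a batch;
       each column is one unit's feature vector.
    Returns the sum of squared off-diagonal entries of the Gram matrix of
    the length-normalized feature vectors, which is zero exactly when the
    features extracted by different hidden units are mutually orthogonal.
    """
    # Normalize each unit's feature vector (column) so the penalty depends
    # on feature direction, not magnitude.
    norms = np.linalg.norm(H, axis=0, keepdims=True)
    Hn = H / np.maximum(norms, 1e-12)
    G = Hn.T @ Hn                        # Gram matrix of feature vectors
    off_diag = G - np.diag(np.diag(G))   # drop self-correlations
    return float(np.sum(off_diag ** 2))
```

Such a term would be added, with a weight, to the usual supervised training loss; orthogonal columns give a penalty of zero, while identical columns are penalized maximally.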
Keywords :
feature extraction; feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); feature extraction; generalization; hidden layer feature regularization; hidden layer units; layered neural network training; model switching; orthogonalization; pattern recognition; supervised learning; Data mining; Feature extraction; Neural networks; Pattern recognition; Probability density function; Size measurement; Supervised learning; Switches; Systems engineering and theory;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2003. Proceedings of the International Joint Conference on
ISSN :
1098-7576
Print_ISBN :
0-7803-7898-9
Type :
conf
DOI :
10.1109/IJCNN.2003.1223769
Filename :
1223769