Title :
Layered neural network training with model switching and hidden layer feature regularization
Author :
Kameyama, Keisuke ; Taga, Kei
Author_Institution :
Tsukuba Adv. Res. Alliance, Tsukuba Univ., Japan
Abstract :
This work introduces a layered neural network training scheme that incorporates dynamic model alteration during training together with regularization of the features extracted by the hidden-layer units. Model Switching (MS), a scheme that searches simultaneously for an optimal model and its parameters, has previously been shown to improve training efficiency and, as a side effect, generalization ability. In MS, the operation that switches the network to a different model involves orthogonalization of the features extracted in the hidden layer. Assuming that this orthogonalization contributes to the observed merits, we propose the joint use of MS and hidden-layer feature orthogonalization realized by a regularization term added to the training objective. A network trained with the proposed scheme is applied to a pattern recognition problem, and improvements in both training efficiency and generalization ability are observed.
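The abstract does not give the exact form of the regularization term, so the following is only an illustrative sketch of one common choice: penalizing the squared off-diagonal entries of the Gram matrix of the (normalized) hidden-layer activations, which is zero exactly when the hidden-layer features are mutually orthogonal. All names here are hypothetical, not taken from the paper.

```python
import numpy as np

def orthogonality_penalty(H):
    """Illustrative regularization term (not the paper's exact formula).

    H has shape (n_samples, n_hidden); each column is the response of one
    hidden unit over the training set. The penalty is the sum of squared
    off-diagonal entries of the Gram matrix of the L2-normalized feature
    columns, so it vanishes iff the hidden-layer features are orthogonal.
    """
    # Normalize each hidden unit's feature vector to unit length.
    norms = np.linalg.norm(H, axis=0, keepdims=True)
    Hn = H / np.maximum(norms, 1e-12)
    G = Hn.T @ Hn                        # Gram matrix, (n_hidden, n_hidden)
    off_diag = G - np.diag(np.diag(G))   # keep only cross-correlations
    return float(np.sum(off_diag ** 2))

# Orthogonal features incur zero penalty; identical (fully redundant)
# features are penalized maximally.
H_orth = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
H_same = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(orthogonality_penalty(H_orth))  # 0.0
print(orthogonality_penalty(H_same))  # ~2.0 (two off-diagonal entries of 1)
```

During training, such a penalty would be added to the task loss with a weighting coefficient, encouraging hidden units to extract decorrelated features.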
Keywords :
feature extraction; feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); generalization; hidden layer feature regularization; hidden layer units; layered neural network training; model switching; orthogonalization; pattern recognition; supervised learning; Data mining; Feature extraction; Neural networks; Pattern recognition; Probability density function; Size measurement; Supervised learning; Switches; Systems engineering and theory;
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks, 2003
Print_ISBN :
0-7803-7898-9
DOI :
10.1109/IJCNN.2003.1223769