DocumentCode :
2130260
Title :
Regularization of hidden layer unit response for neural networks
Author :
Taga, Kei ; Kameyama, Keisuke ; Toraichi, Kazuo
Author_Institution :
Graduate Sch. of Syst. & Inf. Eng., Tsukuba Univ., Ibaraki, Japan
Volume :
1
fYear :
2003
fDate :
28-30 Aug. 2003
Firstpage :
348
Abstract :
In this paper, we look into two issues in pattern recognition using neural networks trained by back propagation (BP), namely inefficient learning and insufficient generalization. We observe that these phenomena are partly caused by the way the hidden layer units respond to the inputs. To address these issues, we introduce regularization of the hidden layer unit response, which amounts to suppressing the correlation among the responses of the hidden layer units, and prune redundant units by unit fusion. Results obtained with the proposed technique are compared with those of the conventional technique on pattern recognition problems. The experiments show that the rate of correct recognition increases when regularization of the hidden layer unit response is applied, and that the required number of training epochs also decreases.
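The record gives no implementation, but the core idea in the abstract, BP training with an added penalty that suppresses correlation among hidden layer unit responses, can be sketched as below. This is a minimal illustrative sketch in PyTorch, not the authors' exact formulation: the penalty weight lam, the network architecture, and the sigmoid activation are assumptions, and the unit-fusion pruning step is omitted.

    import torch
    import torch.nn as nn

    # Minimal sketch (assumed details, not the authors' method): an MLP
    # trained by BP with a penalty on the correlation among the responses
    # of the hidden layer units.

    class MLP(nn.Module):
        def __init__(self, n_in, n_hidden, n_out):
            super().__init__()
            self.hidden = nn.Linear(n_in, n_hidden)
            self.out = nn.Linear(n_hidden, n_out)

        def forward(self, x):
            h = torch.sigmoid(self.hidden(x))  # hidden layer unit responses
            return self.out(h), h

    def correlation_penalty(h, eps=1e-8):
        # Sum of squared off-diagonal entries of the correlation matrix of
        # hidden responses over the batch; large when units respond alike.
        h = h - h.mean(dim=0, keepdim=True)
        h = h / (h.norm(dim=0, keepdim=True) + eps)
        corr = h.t() @ h                       # (n_hidden, n_hidden)
        off_diag = corr - torch.diag(torch.diagonal(corr))
        return (off_diag ** 2).sum()

    model = MLP(n_in=64, n_hidden=16, n_out=10)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = nn.CrossEntropyLoss()
    lam = 0.01  # assumed regularization weight

    x = torch.randn(32, 64)                    # dummy batch for illustration
    y = torch.randint(0, 10, (32,))
    logits, h = model(x)
    loss = criterion(logits, y) + lam * correlation_penalty(h)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()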
Keywords :
backpropagation; neural nets; pattern recognition; back propagation; hidden layer unit response; neural networks; pattern recognition; training epochs; unit fusion; Computer vision; Feature extraction; Neural networks; Pattern recognition; Statistics; Systems engineering and theory;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2003 IEEE Pacific Rim Conference on Communications, Computers and Signal Processing (PACRIM 2003)
Print_ISBN :
0-7803-7978-0
Type :
conf
DOI :
10.1109/PACRIM.2003.1235788
Filename :
1235788