DocumentCode
1748795
Title
Enhancing the generalization ability of neural networks by using Gram-Schmidt orthogonalization algorithm
Author
Wan, Weishui ; Hirasawa, Kotaro ; Hu, Jinglu ; Murata, Junichi
Author_Institution
Intelligent Control Lab., Kyushu Univ., Fukuoka, Japan
Volume
3
fYear
2001
fDate
2001
Firstpage
1721
Abstract
The generalization ability of neural networks is an important criterion for judging whether a learning algorithm is powerful. Many new algorithms have been devised to enhance the generalization ability of neural networks. In this paper, an algorithm that applies Gram-Schmidt orthogonalization to the outputs of the nodes in the hidden layers is proposed, with the aim of reducing the interference among those nodes; it is much more efficient than regularizer-based methods. Simulation results confirm this assertion.
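The abstract does not spell out how the orthogonalization is integrated into training, so the following is only a minimal NumPy sketch of the underlying operation it names: applying (modified) Gram-Schmidt to the matrix of hidden-node outputs over a batch so that the nodes' output vectors become mutually orthogonal. The function name and the toy one-hidden-layer network are assumptions for illustration, not the authors' code.

```python
import numpy as np

def gram_schmidt_columns(H, eps=1e-12):
    """Orthogonalize the columns of H with modified Gram-Schmidt.

    H : (n_samples, n_hidden) matrix whose columns are the outputs of the
        hidden-layer nodes evaluated on a batch of inputs.
    Returns a matrix of the same shape whose columns are mutually
    orthogonal (and normalized), i.e. the hidden-node outputs no longer
    "interfere" in the sense of being correlated across the batch.
    """
    Q = H.astype(float)
    for j in range(Q.shape[1]):
        # Remove the components of column j along the already
        # orthonormalized columns 0..j-1.
        for i in range(j):
            Q[:, j] -= (Q[:, i] @ Q[:, j]) * Q[:, i]
        norm = np.linalg.norm(Q[:, j])
        if norm > eps:            # skip near-zero columns for stability
            Q[:, j] /= norm
    return Q

# Toy usage (hypothetical network, not from the paper):
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))          # batch of 64 inputs, 5 features
W1 = rng.normal(size=(5, 10))         # weights into 10 hidden nodes
H = np.tanh(X @ W1)                   # hidden-layer outputs, shape (64, 10)
H_orth = gram_schmidt_columns(H)
print(np.round(H_orth.T @ H_orth, 3))  # approximately the identity matrix
```

The final print confirms that the orthogonalized hidden outputs are numerically orthonormal; how these decorrelated outputs are then used during backpropagation is described in the paper itself, not in this sketch.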
Keywords
backpropagation; function approximation; generalisation (artificial intelligence); neural nets; sparse matrices; Gram-Schmidt orthogonalization algorithm; generalization ability; hidden layers; neural networks; Backpropagation algorithms; Error correction; Information science; Intelligent control; Interference; Laboratories; Neural networks; Neurons; Sparse matrices; Weight control;
fLanguage
English
Publisher
IEEE
Conference_Title
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location
Washington, DC
ISSN
1098-7576
Print_ISBN
0-7803-7044-9
Type
conf
DOI
10.1109/IJCNN.2001.938421
Filename
938421