Title :
Iteration Learning SGNN
Author :
Li, Aiguo ; Huang, Yong ; Li, Zhanhuai
Author_Institution :
Sch. of Comput. Sci. & Eng., Northwestern Polytech. Univ., Xi'an
Abstract :
Self-generating neural networks (SGNN) have been used in many fields such as classification, clustering, prediction, and recognition. However, for large training datasets, SGNN generate self-generating neural trees (SGNT) that are too large for practical application, and their classification precision leaves room for improvement. In this paper we investigate the performance of a variant of SGNN, named iteration learning SGNN. In the experiments, we use four performance criteria (learning time, number of SGNT nodes, number of misclassified samples, and classification precision) to compare iteration learning SGNN with classical SGNN. The experimental results show that iteration learning SGNN is superior to the classical (batch) SGNN in reducing the number of SGNT nodes and improving classification precision, although the learning time of classical SGNN is shorter than that of iteration learning SGNN.
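The SGNT growth that the abstract refers to (the tree whose node count both variants try to keep small) can be illustrated with a minimal sketch. This is not the authors' implementation; the Euclidean distance, the fixed similarity threshold, and the running-mean weight update are all assumptions made here for illustration:

```python
import math

class SGNTNode:
    """One neuron of a self-generating neural tree (SGNT)."""
    def __init__(self, weight):
        self.weight = list(weight)  # neuron weight vector
        self.count = 1              # number of examples absorbed
        self.children = []

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def winner(node, x):
    """Greedily descend to the node closest to x (the 'winner')."""
    best = node
    while best.children:
        nxt = min(best.children, key=lambda c: distance(c.weight, x))
        if distance(nxt.weight, x) < distance(best.weight, x):
            best = nxt
        else:
            break
    return best

def insert(root, x, threshold=0.5):
    """Present one example: adapt the winner, or grow a new leaf.

    The threshold is a hypothetical parameter: a small threshold
    grows a large tree, which is the size problem the abstract notes.
    """
    w = winner(root, x)
    if distance(w.weight, x) > threshold:
        w.children.append(SGNTNode(x))   # grow the tree with a new neuron
    else:
        w.count += 1                     # adapt the winner toward x
        w.weight = [wi + (xi - wi) / w.count
                    for wi, xi in zip(w.weight, x)]
    return root

def count_nodes(node):
    """Node count of the SGNT, one of the paper's four criteria."""
    return 1 + sum(count_nodes(c) for c in node.children)
```

Under this sketch, a batch learner would make one such pass over the training set, while an iterative variant would revisit the data over several passes; the paper's comparison concerns which strategy yields a smaller tree and better precision.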
Keywords :
iterative methods; learning (artificial intelligence); neural nets; trees (mathematics); classification precision; iteration learning; self-generating neural networks; self-generating neural trees; Application software; Chaos; Classification tree analysis; Computer science; Electronic mail; Joining processes; Neural networks; Neurons; Pattern recognition;
Conference_Titel :
Neural Networks and Brain, 2005. ICNN&B '05. International Conference on
Conference_Location :
Beijing
Print_ISBN :
0-7803-9422-4
DOI :
10.1109/ICNNB.2005.1614998