DocumentCode :
3614406
Title :
Finding optimal architecture and weights using evolutionary least square based learning
Author :
R. Ghosh;B. Verma
Author_Institution :
Sch. of Inf. Technol., Griffith Univ., Australia
Volume :
1
fYear :
2002
Firstpage :
528
Abstract :
In this paper, we present a novel approach to growing a neural network architecture using an evolutionary least-squares based algorithm. The paper focuses on the following aspects: the heuristics for updating weights with an evolutionary least-squares based algorithm, determining the number of hidden neurons for a two-layer feed-forward multilayer perceptron (MLP), the stopping criteria for the algorithm, and finally a comparison of the results with traditional methods for searching for optimal or near-optimal solutions in the multidimensional complex search space comprising the architecture and weight variables. We applied the proposed algorithm to the XOR data set, the 10-bit odd parity problem, and several real benchmark data sets, including a handwriting data set from CEDAR and the breast cancer and heart disease data sets from the UCI ML repository. The comparative results, based on classification accuracy and time complexity, are discussed. We also discuss the issue of finding a probabilistic solution space as a starting point for the least-squares method and address problems involving fitness breaking.
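The abstract's core idea, combining evolutionary search over hidden-layer weights with a closed-form least-squares solve for the output layer, can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the population scheme, mutation scale, and all function names here are assumptions, and the stopping criterion is a simple error threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_hidden(W, X):
    # Hidden activations of a one-hidden-layer MLP (bias column appended).
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.tanh(Xb @ W)

def fitness(W, X, y):
    # Solve the output layer in closed form by linear least squares,
    # then score the whole network by mean squared error.
    H = forward_hidden(W, X)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return np.mean((H @ beta - y) ** 2), beta

def evolve(X, y, hidden=3, pop=20, gens=200, tol=1e-4):
    # Evolutionary loop (hypothetical scheme): keep the fittest quarter of
    # the population as elites and refill by Gaussian mutation of elites.
    d = X.shape[1] + 1                         # inputs + bias
    population = [rng.normal(size=(d, hidden)) for _ in range(pop)]
    best_W, best_err, best_beta = None, np.inf, None
    for _ in range(gens):
        scored = sorted(population, key=lambda W: fitness(W, X, y)[0])
        err, beta = fitness(scored[0], X, y)
        if err < best_err:
            best_W, best_err, best_beta = scored[0], err, beta
        if best_err < tol:                     # stopping criterion
            break
        elites = scored[: pop // 4]
        population = elites + [e + rng.normal(scale=0.3, size=e.shape)
                               for e in elites for _ in range(3)]
    return best_W, best_beta, best_err

# XOR, one of the benchmark problems mentioned in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])
W, beta, err = evolve(X, y)
pred = forward_hidden(W, X) @ beta
```

The least-squares solve means the evolutionary search only has to explore the hidden-layer weight space, which is the main efficiency argument the abstract hints at; growing the architecture would correspond to increasing `hidden` when the error plateaus.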
Keywords :
"Least squares methods","Artificial neural networks","Evolutionary computation","Neural networks","Random access memory","Convergence","Biological cells","Information technology","Gold","Postal services"
Publisher :
ieee
Conference_Titel :
Neural Information Processing, 2002. ICONIP '02. Proceedings of the 9th International Conference on
Print_ISBN :
981-04-7524-1
Type :
conf
DOI :
10.1109/ICONIP.2002.1202226
Filename :
1202226