Title :
Using dynamic expansion and contraction approach to generate a feedforward network
Author :
Young, D. ; Cheng, L.M.
Author_Institution :
Dept. of Electron. Eng., City Polytech. of Hong Kong, Kowloon, Hong Kong
Date :
27 Jun-2 Jul 1994
Abstract :
This paper proposes an algorithm that autonomously generates a feedforward network from the input data. A dynamic expansion and contraction approach (DECA) is used to determine the optimal number of hidden nodes. The algorithm is applicable to neural networks used for function approximation and pattern classification. Interleaving training and testing over short intervals minimises the learning time and avoids over-training the network, so a neural network for a given application can be generated automatically in optimal time. Together with a time-division-multiplexing architecture, a hardware-reconfigurable ANN architecture with learning capabilities can be realised.
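As a rough illustration of the expand/contract idea described in the abstract (not the paper's actual DECA criteria), the sketch below trains a single-hidden-layer NumPy network in short train/test intervals, adds a hidden node when the validation error stops improving, prunes hidden nodes whose output weights become negligible, and stops once the validation error is low enough. The class name TinyMLP, all thresholds and interval lengths, and the toy sin(x) regression task are assumptions made for illustration only.

"""Hypothetical DECA-style growing/pruning loop: a plain NumPy one-hidden-layer
network on a toy function-approximation task.  Thresholds are illustrative."""
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: approximate sin(x) on [-pi, pi].
x_train = rng.uniform(-np.pi, np.pi, (200, 1)); y_train = np.sin(x_train)
x_val = rng.uniform(-np.pi, np.pi, (50, 1));    y_val = np.sin(x_val)


class TinyMLP:
    """1-n-1 network with tanh hidden units; hidden nodes can be added or removed."""

    def __init__(self, n_hidden):
        self.W1 = rng.normal(0, 0.5, (1, n_hidden)); self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, 1)); self.b2 = np.zeros(1)

    def forward(self, x):
        h = np.tanh(x @ self.W1 + self.b1)
        return h, h @ self.W2 + self.b2

    def mse(self, x, y):
        _, out = self.forward(x)
        return float(np.mean((out - y) ** 2))

    def train_interval(self, x, y, steps=200, lr=0.05):
        """One short training interval (batch gradient descent)."""
        for _ in range(steps):
            h, out = self.forward(x)
            err = out - y
            dW2 = h.T @ err / len(x); db2 = err.mean(0)
            dh = (err @ self.W2.T) * (1 - h ** 2)
            dW1 = x.T @ dh / len(x);  db1 = dh.mean(0)
            self.W2 -= lr * dW2; self.b2 -= lr * db2
            self.W1 -= lr * dW1; self.b1 -= lr * db1

    def expand(self):
        """Expansion: append one freshly initialised hidden node."""
        self.W1 = np.hstack([self.W1, rng.normal(0, 0.5, (1, 1))])
        self.b1 = np.append(self.b1, 0.0)
        self.W2 = np.vstack([self.W2, rng.normal(0, 0.5, (1, 1))])

    def contract(self, tol=1e-2):
        """Contraction: drop hidden nodes whose output weight is negligible."""
        keep = np.abs(self.W2[:, 0]) > tol
        if keep.sum() >= 1 and not keep.all():
            self.W1, self.b1, self.W2 = self.W1[:, keep], self.b1[keep], self.W2[keep]


net = TinyMLP(n_hidden=1)
best_val, patience = np.inf, 0
for interval in range(60):                  # interleave short train/test runs
    net.train_interval(x_train, y_train)
    val = net.mse(x_val, y_val)
    if val < best_val - 1e-4:               # still improving: keep current size
        best_val, patience = val, 0
    else:
        patience += 1
        if patience >= 3:                   # plateaued: grow one hidden node
            net.expand()
            patience = 0
    net.contract()                          # prune near-dead nodes each interval
    if val < 1e-3:                          # good enough: stop to avoid over-training
        break
print(f"hidden nodes: {net.W1.shape[1]}, validation MSE: {net.mse(x_val, y_val):.5f}")

The design point this sketch tries to mirror is the short train/test interleaving: because the network is evaluated after every brief interval, growth, pruning and the stopping decision are all driven by held-out error rather than by training error alone.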
Keywords :
feedforward neural nets; function approximation; learning (artificial intelligence); pattern classification; dynamic expansion and contraction approach; feedforward network generation; hardware reconfigurable ANN architecture; hidden nodes; learning time minimisation; time-division-multiplexing architecture; Application software; Approximation algorithms; Artificial neural networks; Contracts; Degradation; Function approximation; Hardware; Neural networks; Random number generation; Testing
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374385