Title :
Exploring recurrent learning for neurofuzzy networks using regularization theory
Author_Institution :
Dept. of Comput. Sci., Essex Univ., Colchester, UK
Date :
2002
Abstract :
This paper establishes a relation between recurrent neurofuzzy networks and regularized neurofuzzy networks, providing a natural, analytical explanation of why recurrent networks outperform feedforward networks at multi-step prediction. Building on this relation, a strategy is developed for balancing multi-step prediction ability against the tendency to diverge in recurrent learning.
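The distinction the abstract draws can be illustrated with a minimal sketch (not the paper's model): a feedforward network trained for one-step prediction can be rolled out recurrently for multi-step prediction by feeding each prediction back as the next input. The scalar map `one_step` below is a hypothetical stand-in for a trained predictor, used only to show the rollout structure.

```python
def one_step(x, a=0.9):
    """Toy stand-in for a trained one-step predictor: x_{t+1} = a * x_t."""
    return a * x

def multi_step(x0, horizon, a=0.9):
    """Recurrent-style rollout: each prediction becomes the next input,
    so one-step errors can compound over the horizon."""
    x = x0
    preds = []
    for _ in range(horizon):
        x = one_step(x, a)
        preds.append(x)
    return preds

print(multi_step(1.0, 3))
```

In this rollout the predictor's own outputs, not measured data, drive later steps, which is where recurrent training helps and where divergence can arise if the learned map is unstable.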
Keywords :
forecasting theory; fuzzy neural nets; learning (artificial intelligence); prediction theory; recurrent neural nets; feedforward networks; multistep prediction; recurrent learning; recurrent neurofuzzy networks; regularization theory; regularized neurofuzzy networks; Algorithm design and analysis; Associative memory; Fuzzy neural networks; Fuzzy sets; Modeling; Neural networks; Nonlinear dynamical systems; Partitioning algorithms; Recurrent neural networks; Spline;
Conference_Titel :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
Print_ISBN :
0-7803-7278-6
DOI :
10.1109/IJCNN.2002.1007785