DocumentCode :
1064880
Title :
Training fully recurrent neural networks with complex weights
Author :
Kechriotis, George ; Manolakos, Elias S.
Author_Institution :
Dept. of Electr. & Comput. Eng., Northeastern Univ., Boston, MA, USA
Volume :
41
Issue :
3
fYear :
1994
fDate :
3/1/1994 12:00:00 AM
Firstpage :
235
Lastpage :
238
Abstract :
In this brief paper, the Real Time Recurrent Learning (RTRL) algorithm for training fully recurrent neural networks in real time is extended to the case of a recurrent neural network whose inputs, outputs, weights, and activation functions are complex. A practical definition of the complex activation function is adopted, and the complex form of the conventional RTRL algorithm is derived. The performance of the proposed algorithm is demonstrated with an application to complex communication channel equalization.
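The setting the abstract describes can be sketched numerically. The fragment below is a minimal illustration, not the paper's derivation: it uses a "split" tanh activation (tanh applied to the real and imaginary parts separately) as one practical choice of bounded complex activation, and it obtains the gradient with respect to the complex weights by central differences on the real and imaginary parts rather than by the analytic complex-RTRL recursions derived in the paper. All names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_tanh(z):
    """'Split' complex activation: tanh on real and imaginary parts
    separately -- one practical, bounded choice (an assumption here;
    the paper adopts its own practical definition)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def run(W, inputs, x0):
    """Fully recurrent net: x(t+1) = f(W @ [x(t); u(t)]), all complex."""
    x, states = x0, []
    for u_t in inputs:
        x = split_tanh(W @ np.concatenate([x, u_t]))
        states.append(x)
    return states

def loss(W, inputs, targets, x0):
    """Real-valued squared error between unit 0 and the complex targets."""
    y = np.array([s[0] for s in run(W, inputs, x0)])
    return 0.5 * float(np.sum(np.abs(targets - y) ** 2))

def complex_grad(W, inputs, targets, x0, eps=1e-6):
    """Gradient w.r.t. complex weights via central differences on the
    real and imaginary parts: dE/dw := dE/dRe(w) + 1j * dE/dIm(w).
    (A numerical stand-in for the analytic complex-RTRL sensitivities.)"""
    g = np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        for direction in (1.0, 1j):
            Wp, Wm = W.copy(), W.copy()
            Wp[idx] += direction * eps
            Wm[idx] -= direction * eps
            g[idx] += direction * (
                loss(Wp, inputs, targets, x0) - loss(Wm, inputs, targets, x0)
            ) / (2 * eps)
    return g

# Tiny demo: 3 fully connected units, 2 complex inputs, 5 time steps.
n, m, T = 3, 2, 5
W = 0.1 * (rng.standard_normal((n, n + m)) + 1j * rng.standard_normal((n, n + m)))
inputs = [rng.standard_normal(m) + 1j * rng.standard_normal(m) for _ in range(T)]
targets = 0.5 * (rng.standard_normal(T) + 1j * rng.standard_normal(T))
x0 = np.zeros(n, dtype=complex)

before = loss(W, inputs, targets, x0)
for _ in range(20):  # plain gradient descent on the complex weights
    W = W - 0.05 * complex_grad(W, inputs, targets, x0)
after = loss(W, inputs, targets, x0)
```

Descending along `dE/dRe(w) + 1j * dE/dIm(w)` updates the real and imaginary parts of each weight independently, so the loss decreases just as in a real-valued network of doubled dimension; the paper's contribution is deriving this gradient analytically and recursively, in real time, rather than by finite differences.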
Keywords :
learning (artificial intelligence); recurrent neural nets; telecommunication channels; transfer functions; RTRL algorithm; Real Time Recurrent Learning; activation functions; complex communication channel equalization; complex weights; fully recurrent neural networks; network inputs; network outputs; network weights; Adaptive equalizers; Backpropagation algorithms; Communication channels; Digital signal processing; Limit-cycles; Neural networks; Recurrent neural networks; Signal processing algorithms; Speech processing; State-space methods;
fLanguage :
English
Journal_Title :
Circuits and Systems II: Analog and Digital Signal Processing, IEEE Transactions on
Publisher :
ieee
ISSN :
1057-7130
Type :
jour
DOI :
10.1109/82.279210
Filename :
279210