Title :
Training fully recurrent neural networks with complex weights
Author :
Kechriotis, George ; Manolakos, Elias S.
Author_Institution :
Dept. of Electr. & Comput. Eng., Northeastern Univ., Boston, MA, USA
Date :
3/1/1994
Abstract :
In this brief paper, the Real Time Recurrent Learning (RTRL) algorithm for training fully recurrent neural networks in real time is extended to the case of a recurrent neural network whose inputs, outputs, weights, and activation functions are complex. A practical definition of the complex activation function is adopted and the complex form of the conventional RTRL algorithm is derived. The performance of the proposed algorithm is demonstrated with an application to complex communication channel equalization.
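The record stops at the abstract, so the paper's exact update equations are not reproduced here. The sketch below is one plausible NumPy rendering of a complex-valued RTRL step under stated assumptions: a "split" activation f(z) = tanh(Re z) + j tanh(Im z) as the practical complex activation, an instantaneous cost J = 1/2 * sum_k |d_k - y_k|^2, and separate sensitivity tensors for the real and imaginary parts of each weight. A split definition of this kind is the usual practical choice because a bounded, non-constant activation cannot be analytic on the whole complex plane (Liouville's theorem). All names (ComplexRTRL, split_tanh) and the toy usage at the end are illustrative, not taken from the paper.

import numpy as np

def split_tanh(s):
    # "split" activation: tanh applied to the real and imaginary parts separately (assumption)
    return np.tanh(s.real) + 1j * np.tanh(s.imag)

def split_tanh_deriv(s):
    # derivatives of the real and imaginary channels of the split activation
    return 1.0 - np.tanh(s.real) ** 2, 1.0 - np.tanh(s.imag) ** 2

class ComplexRTRL:
    """Fully recurrent complex-valued network trained online with an RTRL-style rule (sketch)."""

    def __init__(self, n_in, n_units, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.m, self.n, self.lr = n_in, n_units, lr
        # complex weight matrix acting on [external inputs; previous unit outputs]
        self.W = 0.1 * (rng.standard_normal((n_units, n_in + n_units))
                        + 1j * rng.standard_normal((n_units, n_in + n_units)))
        self.y = np.zeros(n_units, dtype=complex)
        # sensitivities of each unit output w.r.t. the real (P) and imaginary (Q) weight parts
        shape = (n_units, n_units, n_in + n_units)
        self.P = np.zeros(shape, dtype=complex)
        self.Q = np.zeros(shape, dtype=complex)

    def step(self, x, d):
        """One forward step plus an online gradient update toward the complex target vector d."""
        z = np.concatenate([x, self.y])        # concatenated input and previous state
        s = self.W @ z                         # complex net inputs
        y_new = split_tanh(s)
        dr, di = split_tanh_deriv(s)           # per-unit channel derivatives

        # recurrent part of the net-input sensitivities, using the previous P and Q
        W_rec = self.W[:, self.m:]
        dS_a = np.einsum('kl,lij->kij', W_rec, self.P)
        dS_b = np.einsum('kl,lij->kij', W_rec, self.Q)
        # explicit part: dS_k/da_ij = delta_{ki} z_j and dS_k/db_ij = j * delta_{ki} z_j
        idx = np.arange(self.n)
        dS_a[idx, idx, :] += z
        dS_b[idx, idx, :] += 1j * z

        # propagate sensitivities through the split activation
        self.P = dr[:, None, None] * dS_a.real + 1j * di[:, None, None] * dS_a.imag
        self.Q = dr[:, None, None] * dS_b.real + 1j * di[:, None, None] * dS_b.imag

        # instantaneous error; descend the cost w.r.t. the real and imaginary weight parts
        e = d - y_new
        grad_a = np.einsum('k,kij->ij', np.conj(e), self.P).real
        grad_b = np.einsum('k,kij->ij', np.conj(e), self.Q).real
        self.W += self.lr * (grad_a + 1j * grad_b)

        self.y = y_new
        return y_new, e

# Toy usage (hypothetical): one-step-ahead prediction of a unit-circle complex sequence.
# For brevity every unit is treated as an output; in practice only designated output
# units would contribute to the error.
net = ComplexRTRL(n_in=1, n_units=4, lr=0.02)
t = np.arange(200)
seq = np.exp(1j * 0.3 * t)
for x_t, d_t in zip(seq[:-1], seq[1:]):
    y, e = net.step(np.array([x_t]), np.full(4, d_t))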
Keywords :
learning (artificial intelligence); recurrent neural nets; telecommunication channels; transfer functions; RTRL algorithm; Real Time Recurrent Learning; activation functions; complex communication channel equalization; complex weights; fully recurrent neural networks; network inputs; network outputs; network weights; Adaptive equalizers; Backpropagation algorithms; Communication channels; Digital signal processing; Limit-cycles; Neural networks; Recurrent neural networks; Signal processing algorithms; Speech processing; State-space methods;
Journal_Title :
Circuits and Systems II: Analog and Digital Signal Processing, IEEE Transactions on