DocumentCode :
15540
Title :
Projection-Based Fast Learning Fully Complex-Valued Relaxation Neural Network
Author :
Savitha, Ramasamy ; Suresh, Sundaram ; Sundararajan, N.
Author_Institution :
Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore
Volume :
24
Issue :
4
fYear :
2013
fDate :
April 2013
Firstpage :
529
Lastpage :
541
Abstract :
This paper presents a fully complex-valued relaxation network (FCRN) with its projection-based learning algorithm. The FCRN is a single-hidden-layer network with a Gaussian-like sech activation function in the hidden layer and an exponential activation function in the output layer. For a given number of hidden neurons, the input weights are assigned randomly and the output weights are estimated by minimizing a nonlinear logarithmic function (called an energy function) that explicitly contains both the magnitude and phase errors. A projection-based learning algorithm determines the optimal output weights corresponding to the minimum of the energy function by converting the nonlinear programming problem into one of solving a set of simultaneous linear algebraic equations. The resultant FCRN approximates the desired output more accurately and with lower computational effort. The classification ability of the FCRN is evaluated using a set of real-valued benchmark classification problems from the University of California, Irvine machine learning repository, where a circular transformation is used to map the real-valued input features to the complex domain. The FCRN is then used to solve three practical problems: quadrature amplitude modulation channel equalization, adaptive beamforming, and mammogram classification. Performance results clearly indicate the superior classification/approximation performance of the FCRN.
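Illustrative_Sketch :
The sketch below gives a concrete, minimal picture of the workflow described in the abstract: a single-hidden-layer complex-valued model with a sech hidden activation, an exponential output activation, randomly assigned input weights, and output weights obtained by solving linear equations. The names (circular_transform, FCRNSketch), the unit-circle form of the circular transformation, and the regularized complex least-squares solve on log-targets that stands in for the paper's projection-based minimization of the logarithmic energy function are illustrative assumptions, not the authors' exact formulation.

import numpy as np

def circular_transform(X):
    # Hypothetical circular transformation: scale each real feature to [0, 1]
    # and place it on the complex unit circle (the paper's exact map may differ).
    Xn = (X - X.min(axis=0)) / (np.ptp(X, axis=0) + 1e-12)
    return np.exp(1j * np.pi * Xn)

def sech(z):
    # Fully complex hyperbolic secant ("Gaussian-like") hidden activation.
    return 1.0 / np.cosh(z)

class FCRNSketch:
    # Single hidden layer, random complex input weights, exponential output activation.
    def __init__(self, n_hidden, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, Z, Y):
        n_in = Z.shape[1]
        rnd = lambda *s: self.rng.standard_normal(s) + 1j * self.rng.standard_normal(s)
        self.W_in = rnd(n_in, self.n_hidden)   # random input weights (kept fixed)
        self.b = rnd(self.n_hidden)            # random hidden biases
        H = sech(Z @ self.W_in + self.b)       # hidden-layer responses
        # Stand-in for the projection-based step: because the output activation is
        # exp(.), the targets enter through their logarithm and the output weights
        # come from a regularized complex least-squares (normal-equations) solve.
        T = np.log(Y + 1e-12)
        A = H.conj().T @ H + 1e-6 * np.eye(self.n_hidden)
        self.W_out = np.linalg.solve(A, H.conj().T @ T)
        return self

    def predict(self, Z):
        H = sech(Z @ self.W_in + self.b)
        return np.exp(H @ self.W_out)          # exponential output activation

For a real-valued benchmark dataset, one would apply circular_transform to the feature matrix, encode the class labels as complex targets, and then call fit and predict on the transformed data.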
Keywords :
Gaussian processes; adaptive signal processing; array signal processing; complex networks; equalisers; learning (artificial intelligence); linear algebra; neural nets; nonlinear programming; pattern classification; quadrature amplitude modulation; random processes; transfer functions; FCRN; Gaussian-like sech activation function; QAM channel equalization; adaptive beamforming; benchmark classification problems; circular transformation; energy function; exponential activation function; fully complex valued relaxation neural network; hidden layer network; hidden neuron; linear algebraic equations; machine learning repository; mammogram classification; nonlinear programming; projection-based learning algorithm; quadrature amplitude modulation; random weight assignment; weight estimation; Approximation algorithms; Approximation methods; Biological neural networks; Calculus; Equations; Neurons; Training; Adaptive beamforming; classification; complex-valued neural network; energy function; quadrature amplitude modulation (QAM);
fLanguage :
English
Journal_Title :
Neural Networks and Learning Systems, IEEE Transactions on
Publisher :
ieee
ISSN :
2162-237X
Type :
jour
DOI :
10.1109/TNNLS.2012.2235460
Filename :
6414644