Title :
Adaptive error-constrained backpropagation algorithm
Author :
Choi, Sooyong ; Ko, KyunByong ; Hong, Daesik
Author_Institution :
Dept. of Electr. & Comput. Eng., Yonsei Univ., Seoul, South Korea
Abstract :
To accelerate the convergence of the conventional backpropagation (BP) algorithm, constrained optimization techniques are applied to it. First, the noise-constrained least mean square (LMS) algorithm and the zero noise-constrained LMS algorithm are applied, yielding the NCBP and ZNCBP algorithms, respectively. These methods rest on an important assumption: in the NCBP algorithm, the filter or receiver must know the noise variance. By extending and generalizing these algorithms, the authors derive an adaptive error-constrained BP algorithm, along with a simplified variant, in which the error variance is estimated instead. This is achieved by modifying the error function of the conventional BP algorithm using Lagrange multipliers. The convergence speeds of the proposed algorithms are 20 to 30 times faster than that of the conventional BP algorithm, and faster than or almost equal to that of a conventional linear adaptive filter using the LMS algorithm.
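The abstract's core idea — augmenting a mean-square-error cost with a Lagrange-multiplier term that constrains the error power to the noise variance — can be illustrated on the simpler LMS case. The sketch below is an assumption-laden toy, not the paper's exact algorithm: the update forms, step sizes (`mu`, `gamma`, `beta`), and the toy system-identification setup are all illustrative choices. The multiplier `lam` is driven toward (e² − σ²)/2, so the effective step size `mu * (1 + gamma*lam)` is large while the error power exceeds the noise floor and shrinks back to plain LMS near convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system identification: an unknown 3-tap FIR system observed in
# additive Gaussian noise of known variance (the NC-type assumption).
w_true = np.array([0.5, -0.3, 0.2])
n_taps = len(w_true)
N = 5000
noise_var = 0.01
x = rng.standard_normal(N)
d = np.convolve(x, w_true, mode="full")[:N] + rng.normal(0.0, np.sqrt(noise_var), N)

def nc_lms(x, d, n_taps, mu=0.01, gamma=1.0, beta=0.01, sigma2=noise_var):
    """Noise-constrained LMS sketch (illustrative update forms).

    The LMS step is scaled by (1 + gamma*lam), where lam is a running
    Lagrange-multiplier estimate pushing E[e^2] toward sigma2, the
    assumed-known noise variance.
    """
    w = np.zeros(n_taps)
    lam = 0.0
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]      # [x[n], x[n-1], x[n-2]]
        e = d[n] - w @ u                        # a priori error
        w = w + mu * (1.0 + gamma * lam) * e * u
        lam = lam + beta * ((e * e - sigma2) / 2.0 - lam)
    return w

w_hat = nc_lms(x, d, n_taps)
print(np.round(w_hat, 2))   # close to w_true after convergence
```

The paper's adaptive error-constrained variant replaces the known `sigma2` with a running estimate of the error variance, removing the assumption that the receiver knows the noise power; the BP versions apply the same constrained cost to a neural network's error function rather than a linear filter.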
Keywords :
backpropagation; constraint handling; least mean squares methods; neural nets; Lagrangian multipliers; adaptive error-constrained BP algorithm; adaptive error-constrained backpropagation algorithm; constrained optimization; convergence speed; error variance; neural networks; noise variance; noise-constrained least mean square algorithm; Acceleration; Additive noise; Backpropagation algorithms; Convergence; Filters; Gaussian noise; Lagrangian functions; Least squares approximation; Neural networks; Signal processing algorithms;
Conference_Titel :
Neural Networks for Signal Processing XI, 2001. Proceedings of the 2001 IEEE Signal Processing Society Workshop
Conference_Location :
North Falmouth, MA
Print_ISBN :
0-7803-7196-8
DOI :
10.1109/NNSP.2001.943115