DocumentCode :
1552184
Title :
A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size
Author :
Mandic, Danilo P. ; Hanna, Andrew I. ; Razaz, Moe
Author_Institution :
Sch. of Inf. Syst., East Anglia Univ., Norwich, UK
Volume :
8
Issue :
11
fYear :
2001
Firstpage :
295
Lastpage :
297
Abstract :
A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for online adaptation of nonlinear neural filters is proposed. An adaptive step size that minimizes the instantaneous output error of the filter is derived via a first-order Taylor series expansion of the output error. For rigor, the remainder of the truncated Taylor series expansion within the expression for the adaptive learning rate is itself made adaptive and is updated by gradient descent. The FANNGD algorithm is shown to converge faster than previously introduced algorithms of this kind.
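The update described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the letter's exact algorithm: it assumes a single tanh neuron fed by a tap-delay line, normalizes the step size by the regularized, gradient-weighted input power, and adapts the regularization (Taylor-remainder) term `eps` by a gradient-descent-style rule; all function and parameter names (`fanngd_filter`, `mu`, `rho`, `eps0`) are hypothetical.

```python
import numpy as np

def fanngd_filter(x, d, N=5, mu=0.5, rho=1e-3, eps0=1.0):
    """Illustrative FANNGD-style adaptive filter (single tanh neuron).

    The step size is normalized by eps + ||phi'(net) * x||^2, and eps,
    standing in for the adaptive Taylor-remainder term of the letter,
    is itself updated by a gradient-descent-style rule (a simplified
    stand-in, not the paper's exact update).
    """
    w = np.zeros(N)                       # filter weights
    eps = eps0                            # adaptive regularization term
    e_prev, xg_prev, denom_prev = 0.0, np.zeros(N), 1.0
    errors = []
    for k in range(N, len(x) + 1):
        xk = x[k - N:k][::-1]             # tap-delay input vector
        y = np.tanh(w @ xk)               # nonlinear filter output
        e = d[k - 1] - y                  # instantaneous output error
        phi_p = 1.0 - y**2                # tanh derivative at the output
        xg = phi_p * xk                   # gradient-scaled input
        denom = eps + xg @ xg
        w += (mu / denom) * e * xg        # normalized NGD weight update
        # gradient-descent adaptation of eps (simplified stand-in)
        eps -= rho * mu * e * e_prev * (xg @ xg_prev) / denom_prev**2
        eps = max(eps, 1e-6)              # keep the denominator positive
        e_prev, xg_prev, denom_prev = e, xg, denom
        errors.append(e**2)
    return w, np.array(errors)

# Toy usage: identify an unknown tanh system driven by white noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.2, 0.1, -0.05])  # hypothetical system weights
d = np.zeros(len(x))
for k in range(5, len(x) + 1):
    d[k - 1] = np.tanh(h @ x[k - 5:k][::-1])

w_hat, errors = fanngd_filter(x, d)
```

On this noiseless toy problem the squared error trace `errors` decays toward zero as the weights approach `h`; the point of the sketch is only the shape of the normalized update, not the letter's convergence analysis.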
Keywords :
adaptive filters; adaptive signal processing; convergence of numerical methods; filtering theory; gradient methods; nonlinear filters; series (mathematics); FANNGD algorithm; Taylor series expansion; adaptive learning rate; adaptive normalized nonlinear gradient descent; convergence; gradient adaptive step size; nonlinear adaptive filters; nonlinear neural filters; normalized gradient descent algorithm; online adaptation; output error minimisation; truncated Taylor series expansion; Adaptive filters; Adaptive systems; Convergence; Finite impulse response filter; Mathematical model; Neural networks; Neurons; Signal processing; Signal processing algorithms; Taylor series;
fLanguage :
English
Journal_Title :
IEEE Signal Processing Letters
Publisher :
IEEE
ISSN :
1070-9908
Type :
jour
DOI :
10.1109/97.969448
Filename :
969448