DocumentCode
938658
Title
The least mean fourth (LMF) adaptive algorithm and its family
Author
Walach, Eugene; Widrow, Bernard
Volume
30
Issue
2
fYear
1984
fDate
3/1/1984
Firstpage
275
Lastpage
283
Abstract
New steepest descent algorithms for adaptive filtering have been devised which allow error minimization in the mean fourth and mean sixth, etc., sense. During adaptation, the weights undergo exponential relaxation toward their optimal solutions. Time constants have been derived, and surprisingly they turn out to be proportional to the time constants that would have been obtained if the steepest descent least mean square (LMS) algorithm of Widrow and Hoff had been used. The new gradient algorithms are insignificantly more complicated to program and to compute than the LMS algorithm. Their general form is

$$W_{j+1} = W_j + 2\mu K \varepsilon_j^{2K-1} X_j,$$

where $W_j$ is the present weight vector, $W_{j+1}$ is the next weight vector, $\varepsilon_j$ is the present error, $X_j$ is the present input vector, $\mu$ is a constant controlling stability and rate of convergence, and $2K$ is the exponent of the error being minimized. Conditions have been derived for weight-vector convergence of the mean and of the variance for the new gradient algorithms. The behavior of the least mean fourth (LMF) algorithm is of special interest. In comparing this algorithm to the LMS algorithm, when both are set to have exactly the same time constants for the weight relaxation process, the LMF algorithm, under some circumstances, will have a substantially lower weight noise than the LMS algorithm. It is possible, therefore, that a minimum mean fourth error algorithm can do a better job of least squares estimation than a mean square error algorithm. This intriguing concept has implications for all forms of adaptive algorithms, whether they are based on steepest descent or otherwise.
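The following is a minimal sketch of the update rule quoted in the abstract, applied to a system-identification task with $K = 2$ (the LMF algorithm, which minimizes the mean fourth error). The plant coefficients, step size, noise level, and filter length are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_taps = 4
plant = rng.standard_normal(n_taps)  # unknown system to identify (assumed)
w = np.zeros(n_taps)                 # adaptive weight vector W_j
mu, K = 1e-3, 2                      # K = 2 gives the LMF algorithm (error^4 cost)

x_hist = np.zeros(n_taps)
for j in range(20_000):
    # Shift in the newest input sample to form the present input vector X_j.
    x_hist = np.roll(x_hist, 1)
    x_hist[0] = rng.standard_normal()
    # Desired response: plant output plus observation noise (assumed level).
    d = plant @ x_hist + 0.1 * rng.standard_normal()
    eps = d - w @ x_hist                              # present error eps_j
    w += 2 * mu * K * eps ** (2 * K - 1) * x_hist     # W_{j+1} = W_j + 2*mu*K*eps_j^(2K-1)*X_j

print("estimated weights:", w)
print("true plant:       ", plant)
```

Setting K = 1 in the same loop recovers the familiar LMS update, which makes the family relationship described in the abstract easy to verify numerically.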
Keywords
Adaptive filters; Least-pth approximation; Adaptive algorithm; Convergence; Error correction; Filtering algorithms; Least squares approximation; Minimization methods; Noise cancellation; Performance analysis; Signal processing algorithms
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.1984.1056886
Filename
1056886
Link To Document