DocumentCode :
1277885
Title :
On the regularization of forgetting recursive least square
Author :
Leung, Chi Sing ; Young, Gilbert H. ; Sum, John ; Kan, Wing-Kay
Author_Institution :
Dept. of Electron. Eng., Hong Kong Univ., Hong Kong
Volume :
10
Issue :
6
fYear :
1999
fDate :
11/1/1999
Firstpage :
1482
Lastpage :
1486
Abstract :
The regularization effect of employing the forgetting recursive least square (FRLS) training technique on feedforward neural networks is studied. We derive our result from the corresponding equations for the expected prediction error and the expected training error. By comparing these error equations with those obtained previously for the weight decay method, we find that the FRLS technique has an effect identical to that of the simple weight decay method. This finding suggests that the FRLS technique is another online approach for realizing the weight decay effect. Moreover, we show that, under certain conditions, both the model complexity and the expected prediction error of a model trained by the FRLS technique are better than those of one trained by the standard RLS method.
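The abstract's claim concerns the FRLS recursion itself: a forgetting factor strictly below one exponentially discounts old samples, and the paper shows this discounting acts like weight decay regularization. A minimal sketch of the standard FRLS update is below; this is the textbook recursion, not code from the paper, and the initialization scale `delta` and the single-output setting are illustrative assumptions.

```python
import numpy as np

def frls_fit(X, y, lam=0.99, delta=100.0):
    """Sketch of forgetting recursive least squares (FRLS).

    lam   : forgetting factor in (0, 1]; lam < 1 discounts old samples,
            which the paper relates to a weight-decay-like regularizer.
    delta : scale of the initial inverse-correlation matrix (an
            illustrative assumption, not a value from the paper).
    """
    n_features = X.shape[1]
    w = np.zeros(n_features)           # weight estimate
    P = delta * np.eye(n_features)     # inverse-correlation matrix
    for x, target in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        w = w + k * (target - x @ w)   # prediction-error correction
        P = (P - np.outer(k, Px)) / lam  # discounted covariance update
    return w
```

Setting `lam=1.0` recovers standard RLS; values below one shrink the effective memory to roughly `1 / (1 - lam)` samples, which is where the implicit regularization enters.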
Keywords :
feedforward neural nets; learning (artificial intelligence); least squares approximations; expected prediction error; expected training error; forgetting recursive least square; model complexity; regularization; training technique; weight decay method; Computer errors; Computer science; Equations; Feedforward neural networks; Helium; Least squares approximation; Least squares methods; Neural networks; Predictive models; Resonance light scattering;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.809093
Filename :
809093