DocumentCode :
2956320
Title :
The wellposedness analysis of the kernel adaline
Author :
Liu, Weifeng ; Príncipe, Jose C.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Florida, Gainesville, FL
fYear :
2008
fDate :
1-8 June 2008
Firstpage :
1062
Lastpage :
1067
Abstract :
In this paper, we investigate the wellposedness of the kernel adaline. The kernel adaline finds the linear coefficients of a radial basis function network using deterministic gradient descent. We show that gradient descent provides an inherent regularization as long as training is properly early-stopped. Together with other popular regularization techniques, this result is analyzed within a unifying regularization-function concept. This understanding provides an alternative and possibly simpler way to obtain regularized solutions compared with the cross-validation approach used in regularization networks.
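The following is a minimal sketch of the kind of procedure the abstract describes, not the authors' implementation: a kernel adaline whose expansion coefficients are trained by full-batch ("deterministic") gradient descent on a squared-error cost, with early stopping on a held-out set acting as the regularizer. The Gaussian kernel, kernel width, step size, validation split, and stopping rule are illustrative assumptions, not taken from the paper.

# Sketch only: kernel adaline via deterministic gradient descent with early stopping.
# All hyperparameters below are hypothetical choices for illustration.
import numpy as np

def rbf_kernel(X, Z, width=1.0):
    # Gaussian kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 * width^2))
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-d2 / (2.0 * width**2))

def kernel_adaline(X, y, width=1.0, lr=0.1, max_epochs=500, val_frac=0.2):
    # Fit coefficients alpha of f(x) = sum_i alpha_i k(x, x_i) by full-batch
    # gradient descent on the squared error; stop when validation error rises.
    n = len(y)
    idx = np.random.permutation(n)
    n_val = int(val_frac * n)
    tr, va = idx[n_val:], idx[:n_val]
    K_tr = rbf_kernel(X[tr], X[tr], width)
    K_va = rbf_kernel(X[va], X[tr], width)
    alpha = np.zeros(len(tr))
    best_alpha, best_err = alpha.copy(), np.inf
    for _ in range(max_epochs):
        residual = y[tr] - K_tr @ alpha              # training residual
        alpha += lr / len(tr) * K_tr.T @ residual    # deterministic gradient step
        val_err = np.mean((y[va] - K_va @ alpha) ** 2)
        if val_err < best_err:                       # keep the best iterate so far
            best_err, best_alpha = val_err, alpha.copy()
        else:
            break                                    # early stopping: validation error rose
    return best_alpha, X[tr]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    alpha, centers = kernel_adaline(X, y)
    print("trained expansion with", len(alpha), "centers")

The single validation check used here is a crude stand-in for the stopping criteria the paper relates to regularization; the point of the sketch is only that stopping the iteration early limits the effective solution norm, rather than adding an explicit penalty term.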
Keywords :
gradient methods; radial basis function networks; cross-validation approach; deterministic gradient descent; kernel adaline; linear coefficients; radial basis function network; regularization networks; regularization-function concept; wellposedness analysis; Bayesian methods; Cost function; Eigenvalues and eigenfunctions; Hilbert space; Intelligent networks; Kernel; Radial basis function networks; Singular value decomposition; Stability; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2008 IEEE International Joint Conference on Neural Networks (IJCNN 2008), IEEE World Congress on Computational Intelligence
Conference_Location :
Hong Kong
ISSN :
1098-7576
Print_ISBN :
978-1-4244-1820-6
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2008.4633930
Filename :
4633930
Link To Document :
https://doi.org/10.1109/IJCNN.2008.4633930