DocumentCode
966768
Title
A Forward-Constrained Regression Algorithm for Sparse Kernel Density Estimation
Author
Hong, Xia ; Chen, Sheng ; Harris, Chris J.
Volume
19
Issue
1
fYear
2008
Firstpage
193
Lastpage
198
Abstract
Using the classical Parzen window (PW) estimate as the target function, a sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimized subject to a simple positivity constraint in each forward stage. The model parameter estimate in each forward stage is simply the solution of the jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss-Newton method with the model parameter estimate held fixed. The proposed approach is simple to implement and its computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
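The abstract describes a greedy, one-kernel-at-a-time construction against a Parzen window target. The following Python sketch illustrates that idea under simplifying assumptions: Gaussian kernels centred on data points, a fixed kernel width, and a plain squared-error fit to the Parzen window target in place of the paper's LOO/jackknife criterion and Gauss-Newton width update. All function names are illustrative; this is a sketch of the general forward-selection idea, not the authors' algorithm.

import numpy as np

def gauss_kernel(x, centres, width):
    # N x M matrix of normalised Gaussian kernel values at points x for the given centres
    d = x[:, None] - centres[None, :]
    return np.exp(-0.5 * (d / width) ** 2) / (np.sqrt(2.0 * np.pi) * width)

def sparse_kde_fcr(x, width=0.3, max_kernels=10, tol=1e-4):
    # Greedy forward selection: add kernels (centred on data points) one at a time
    # so that the weighted mixture tracks the full Parzen window estimate.
    n = len(x)
    K = gauss_kernel(x, x, width)          # candidate kernel responses at the data points
    target = K.mean(axis=1)                # Parzen window estimate evaluated at the data
    selected, weights = [], []
    residual = target.copy()
    for _ in range(max_kernels):
        best, best_w, best_err = None, 0.0, np.inf
        for j in range(n):
            if j in selected:
                continue
            g = K[:, j]
            w = max(residual @ g / (g @ g), 0.0)   # one-parameter least squares, clipped for positivity
            err = np.sum((residual - w * g) ** 2)
            if err < best_err:
                best, best_w, best_err = j, w, err
        if best is None or best_w <= 0.0:
            break
        selected.append(best)
        weights.append(best_w)
        residual = residual - best_w * K[:, best]
        if best_err < tol:
            break
    w = np.array(weights)
    w = w / w.sum()                        # renormalise so the mixture integrates to one
    return x[np.array(selected)], w, width

# usage: fit on a bimodal sample and evaluate the sparse density on a grid
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.5, 100), rng.normal(1.0, 1.0, 100)])
centres, w, h = sparse_kde_fcr(data)
grid = np.linspace(-5.0, 5.0, 200)
density = gauss_kernel(grid, centres, h) @ w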
Keywords
Newton method; parameter estimation; regression analysis; Gauss-Newton method; classical Parzen window; forward-constrained regression algorithm; jackknife parameter estimator; leave-one-out test score; positivity constraint check; sparse kernel density estimation; Computational efficiency; Distribution functions; Gaussian processes; Kernel; Machine learning; Support vector machines; Testing; Training data; Cross validation; Parzen window (PW); probability density function (pdf); sparse modeling; Algorithms; Artificial Intelligence; Humans; Neural Networks (Computer); Signal Processing, Computer-Assisted
fLanguage
English
Journal_Title
Neural Networks, IEEE Transactions on
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/TNN.2007.908645
Filename
4378278
Link To Document