DocumentCode :
2172224
Title :
Perturbation regulated kernel regressors for supervised machine learning
Author :
Kung, S.Y. ; Wu, Pei-yuan
Author_Institution :
Princeton Univ., Princeton, NJ, USA
fYear :
2012
fDate :
23-26 Sept. 2012
Firstpage :
1
Lastpage :
6
Abstract :
This paper develops a kernel perturbation-regulated (KPR) regressor based on errors-in-variables models. KPR offers a strong smoothing capability critical to the robustness of regression or classification results. For Gaussian cases, the notion of orthogonal polynomials is instrumental to optimal estimation and its error analysis. More precisely, the regressor may be expressed as a linear combination of many simple Hermite regressors, each focusing on one (and only one) orthogonal polynomial. For Gaussian or non-Gaussian cases, this paper formally establishes a “Two-Projection Theorem” allowing the estimation task to be divided into two projection stages: the first projection reveals the effect of model-induced error (caused by under-represented regressor models), while the second projection reveals the extra estimation error due to the (inevitable) input measurement error. The two-projection analysis leads to a closed-form error formula critical for the order/error tradeoff. The simulation results not only confirm the theoretical prediction but also demonstrate the superiority of KPR over the conventional ridge regression method in MSE reduction.
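The abstract contrasts KPR's perturbation-based smoothing with conventional kernel ridge regression. The sketch below is only a rough illustration of that general idea, not the paper's KPR estimator: it fits a standard Gaussian-kernel ridge regressor, and then a variant in which each training input is replicated with additive Gaussian perturbations before fitting, mimicking an errors-in-variables smoothing effect. All function names, kernel widths, and noise levels here are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: noisy samples of a smooth target function.
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

def gaussian_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_ridge_fit(X, y, lam=1e-2, sigma=1.0):
    # Conventional kernel ridge regression: alpha = (K + lam*I)^{-1} y.
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def perturbation_smoothed_fit(X, y, noise=0.3, n_copies=20, lam=1e-2, sigma=1.0):
    # Illustrative errors-in-variables-style smoothing (NOT the paper's KPR):
    # replicate each training input with additive Gaussian perturbations,
    # then fit ridge regression on the enlarged, jittered training set.
    n, d = X.shape
    Xp = np.repeat(X, n_copies, axis=0) + noise * rng.standard_normal((n * n_copies, d))
    yp = np.repeat(y, n_copies)
    return Xp, kernel_ridge_fit(Xp, yp, lam, sigma)

alpha = kernel_ridge_fit(X, y)
Xp, alpha_p = perturbation_smoothed_fit(X, y)

# Evaluate both regressors against the clean target on a test grid.
Xt = np.linspace(-3, 3, 200)[:, None]
pred_ridge = gaussian_kernel(Xt, X) @ alpha
pred_smooth = gaussian_kernel(Xt, Xp) @ alpha_p
mse_ridge = np.mean((pred_ridge - np.sin(Xt[:, 0])) ** 2)
mse_smooth = np.mean((pred_smooth - np.sin(Xt[:, 0])) ** 2)
```

The input-replication trick is one well-known way to realize perturbation regularization numerically; the paper instead derives a closed-form estimator via orthogonal (Hermite) polynomial projections.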
Keywords :
Gaussian processes; learning (artificial intelligence); pattern classification; polynomials; regression analysis; Gaussian cases; Hermite regressors; KPR regressor; MSE reduction; classification results; closed-form error formula; conventional ridge regression method; error analysis; errors-in-variables models; extra estimation error; input measuring error; model-induced error effect; order-error tradeoff; orthogonal polynomials; perturbation regulated kernel regressors; regression robustness; supervised machine learning; two-projection analysis; two-projection theorem; underrepresented regressor models; Estimation; Kernel; Polynomials; Regression analysis; Robustness; Smoothing methods; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Machine Learning for Signal Processing (MLSP), 2012 IEEE International Workshop on
Conference_Location :
Santander
ISSN :
1551-2541
Print_ISBN :
978-1-4673-1024-6
Electronic_ISBN :
1551-2541
Type :
conf
DOI :
10.1109/MLSP.2012.6349743
Filename :
6349743