DocumentCode
3391003
Title
Kernel Classification via Integrated Squared Error
Author
Kim, JooSeuk; Scott, Clayton D.
Author_Institution
Dept. of EECS, University of Michigan, Ann Arbor, MI, USA. E-mail: stannum@umich.edu
fYear
2007
fDate
26-29 Aug. 2007
Firstpage
783
Lastpage
787
Abstract
Nonparametric kernel methods are widely used and have proven successful in many statistical learning problems. Well-known examples include the kernel density estimate (KDE) for density estimation and the support vector machine (SVM) for classification. We propose a kernel classifier that optimizes an integrated squared error (ISE) criterion based on a "difference of densities" formulation. Our classifier is sparse, like SVMs, and performs comparably to state-of-the-art kernel methods. Furthermore, unlike SVMs, the ISE criterion does not require the user to set any unknown regularization parameters. As a consequence, classifier training is faster than for support vector methods.
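The "difference of densities" idea in the abstract can be illustrated with a minimal Python sketch. This is not the authors' implementation: it estimates each class-conditional density with a uniformly weighted Gaussian KDE and labels a point by the sign of the prior-weighted density difference, whereas the paper's ISE classifier instead selects sparse, non-uniform kernel weights by minimizing the ISE criterion via quadratic programming. The function names gaussian_kernel and dod_classify, the fixed bandwidth, and the uniform weights below are illustrative assumptions.

# Minimal sketch of a "difference of densities" kernel classifier
# (illustrative only; not the ISE-optimized, sparse classifier of the paper).
import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    """Gaussian kernel matrix K[i, j] = k_h(X[i], Z[j])."""
    d = X.shape[1]
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2)) / norm

def dod_classify(X_train, y_train, X_test, bandwidth=1.0):
    """Label by the sign of pi_+ * f_+(x) - pi_- * f_-(x), where f_+ and f_-
    are KDEs of the two classes (labels +1 / -1) and pi_+, pi_- are the
    empirical class priors."""
    pos, neg = X_train[y_train == 1], X_train[y_train == -1]
    pi_pos = len(pos) / len(X_train)
    pi_neg = 1.0 - pi_pos
    f_pos = gaussian_kernel(X_test, pos, bandwidth).mean(axis=1)
    f_neg = gaussian_kernel(X_test, neg, bandwidth).mean(axis=1)
    return np.sign(pi_pos * f_pos - pi_neg * f_neg)

# Toy usage on two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.hstack([-np.ones(100), np.ones(100)])
print((dod_classify(X, y, X) == y).mean())  # training accuracy

In the paper's formulation, the uniform KDE weights above would be replaced by nonnegative weights chosen to minimize an integrated squared error between the weighted kernel expansion and the true difference of densities, which is what yields sparsity without a user-set regularization parameter.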
Keywords
Bandwidth; Bayesian methods; Costs; Decision theory; Electronic mail; Kernel; Quadratic programming; Statistical learning; Support vector machine classification; Support vector machines; difference of densities; integrated squared error; kernel methods; quadratic programming; sparse classifiers;
fLanguage
English
Publisher
ieee
Conference_Title
2007 IEEE/SP 14th Workshop on Statistical Signal Processing (SSP '07)
Conference_Location
Madison, WI, USA
Print_ISBN
978-1-4244-1198-6
Electronic_ISBN
978-1-4244-1198-6
Type
conf
DOI
10.1109/SSP.2007.4301366
Filename
4301366
Link To Document