Title :
An Optimal Basis for Feature Extraction with Support Vector Machine Classification Using the Radius-Margin Bound
Author :
Fortuna, J. ; Capson, D.
Author_Institution :
Dept. of Electr. & Comput. Eng., McMaster Univ., Hamilton, Ont.
Abstract :
A method is presented for deriving an optimal basis for features classified with a support vector machine. The method is based on minimizing the leave-one-out error, which is approximated by the radius-margin bound. A gradient descent method provides a learning rule for the basis in the outer loop of an iterative procedure. The inner loop performs support vector machine training and provides the support vector coefficients on which the gradient descent depends. In this way, the basis for feature extraction and the support vector machine are jointly optimized. The efficacy of the method is illustrated with examples from multidimensional synthetic data sets.
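The abstract describes an alternating scheme: an inner loop that trains the SVM on the projected features and an outer loop that updates the basis by gradient descent on the radius-margin bound R^2 ||w||^2. The Python sketch below illustrates that structure only and is not the authors' implementation; the linear kernel, the centroid-based approximation of the enclosing-sphere radius R, the finite-difference gradient, and all function and parameter names (learn_basis, n_components, lr, and so on) are assumptions made for illustration, whereas the paper derives an analytic gradient.

# Minimal sketch of the alternating optimization (assumptions noted above).
import numpy as np
from sklearn.svm import SVC

def radius_margin_bound(W, X, svc):
    """R^2 * ||w||^2 with the support vector coefficients held fixed."""
    Z = X @ W                                  # project data onto current basis
    beta = svc.dual_coef_.ravel()              # alpha_i * y_i for support vectors
    w = beta @ Z[svc.support_]                 # SVM weight vector in feature space
    margin_term = w @ w                        # ||w||^2 (inverse squared margin)
    centered = Z - Z.mean(axis=0)
    radius_sq = np.max(np.sum(centered**2, axis=1))   # approximate R^2 (assumption)
    return radius_sq * margin_term

def learn_basis(X, y, n_components=2, n_outer=30, lr=1e-3, eps=1e-5, C=100.0):
    rng = np.random.default_rng(0)
    d = X.shape[1]
    W = rng.standard_normal((d, n_components))
    W /= np.linalg.norm(W, axis=0)
    for _ in range(n_outer):
        # Inner loop: SVM training on the current projection.
        svc = SVC(kernel="linear", C=C).fit(X @ W, y)
        # Outer loop: finite-difference gradient of the bound w.r.t. W
        # (the support vector coefficients from the inner loop are held fixed).
        base = radius_margin_bound(W, X, svc)
        grad = np.zeros_like(W)
        for i in range(d):
            for j in range(n_components):
                Wp = W.copy()
                Wp[i, j] += eps
                grad[i, j] = (radius_margin_bound(Wp, X, svc) - base) / eps
        W -= lr * grad
        W /= np.linalg.norm(W, axis=0)         # keep basis vectors unit length
    return W

if __name__ == "__main__":
    # Synthetic multidimensional two-class data, in the spirit of the paper's examples.
    rng = np.random.default_rng(1)
    X0 = rng.standard_normal((50, 5)) + np.array([1.5, 0, 0, 0, 0])
    X1 = rng.standard_normal((50, 5)) - np.array([1.5, 0, 0, 0, 0])
    X = np.vstack([X0, X1])
    y = np.hstack([np.zeros(50), np.ones(50)])
    W = learn_basis(X, y)
    print("learned basis shape:", W.shape)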
Keywords :
feature extraction; gradient methods; pattern classification; support vector machines; gradient descent method; multidimensional synthetic data sets; radius-margin bound; support vector machine classification; Computer errors; Data mining; Independent component analysis; Iterative algorithms; Kernel; Pattern recognition; Principal component analysis
Conference_Title :
Proceedings of the 2006 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2006)
Conference_Location :
Toulouse
Print_ISBN :
1-4244-0469-X
DOI :
10.1109/ICASSP.2006.1661338