DocumentCode
2768412
Title
Learning of Kernel Functions in Support Vector Machines
Author
Yang, Chih-Cheng ; Lee, Wan-Jui ; Lee, Shie-Jue
Author_Institution
Nat. Sun Yat-Sen Univ., Kaohsiung
fYear
2006
fDate
0-0 0
Firstpage
1150
Lastpage
1155
Abstract
The selection and learning of kernel functions is an important but rarely studied problem in support vector learning, even though the kernel function of a support vector machine strongly influences its performance. The kernel function maps the dataset from the original data space into a feature space, so problems that cannot be solved in a low-dimensional space may become solvable in a higher-dimensional one through this transformation. In this paper, we introduce the gradient descent method into the learning of kernel functions. Using gradient descent, we derive learning rules for the parameters that determine the shape and distribution of the kernel functions, and thus obtain better kernel functions by training these parameters with respect to the risk minimization principle. Experimental results show that our approach derives better kernel functions and therefore achieves better generalization ability than other methods.
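The core idea of the abstract, tuning a kernel's shape parameters by gradient descent on a training objective, can be sketched as follows. This is an illustrative example only: it optimizes the width of an RBF kernel by gradient ascent on kernel-target alignment, a common differentiable surrogate, rather than the paper's own risk-minimization formulation, and all function names here are hypothetical.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2),
    plus the squared-distance matrix D needed for the gradient."""
    sq = np.sum(X ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * D), D

def alignment(K, Y):
    """Kernel-target alignment: cosine between K and Y = y y^T
    viewed as vectors (Frobenius inner product)."""
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

def learn_gamma(X, y, gamma=0.01, lr=0.1, steps=200):
    """Gradient ascent on alignment w.r.t. the kernel width gamma.
    Illustrates the general recipe: differentiate the objective
    through the kernel entries and update the kernel parameter."""
    Y = np.outer(y, y)
    nY = np.linalg.norm(Y)
    for _ in range(steps):
        K, D = rbf_kernel(X, gamma)
        dK = -D * K                      # dK_ij/dgamma
        nK = np.linalg.norm(K)
        a = np.sum(K * Y) / (nK * nY)
        # quotient rule: d/dgamma [ <K,Y> / (||K|| ||Y||) ]
        grad = np.sum(dK * Y) / (nK * nY) - a * np.sum(K * dK) / nK ** 2
        gamma = max(gamma + lr * grad, 1e-6)
    return gamma
```

On a toy two-cluster dataset, the learned gamma yields a kernel matrix whose alignment with the labels is higher than that of the initial one, which is the same "train the kernel's parameters against an objective" principle the paper develops for risk minimization.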
Keywords
gradient methods; learning (artificial intelligence); support vector machines; feature space; generalization ability; gradient descent method; kernel functions; risk minimization principle; support vector learning; support vector machines; Kernel; Machine learning; Pattern recognition; Risk management; Shape; Statistical learning; Support vector machine classification; Support vector machines; Training data; Virtual colonoscopy;
fLanguage
English
Publisher
ieee
Conference_Titel
Neural Networks, 2006. IJCNN '06. International Joint Conference on
Conference_Location
Vancouver, BC
Print_ISBN
0-7803-9490-9
Type
conf
DOI
10.1109/IJCNN.2006.246820
Filename
1716231
Link To Document