Title :
Learning curves of polynomial kernel classifiers
Author_Institution :
Kyoto Univ., Japan
Abstract :
The generalization properties of polynomial kernel classifiers are examined. Since a kernel classifier nonlinearly maps an input vector to a vector in a high-dimensional feature space and linearly discriminates it there, its learning curve resembles that of a linear dichotomy, whose average generalization error in the asymptotic limit is proportional to the dimension of the input space and inversely proportional to the number of given examples. This paper shows that the asymptotic average generalization error depends on the relationship between the subset of the feature space on which the feature vectors lie and the true separating hyperplane; more specifically, it depends on the essential dimension of the feature space in the neighborhood of their intersection.
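The following is a minimal Python sketch (not from the paper) of the mechanism the abstract describes: a homogeneous degree-2 polynomial kernel k(x, z) = (x . z)^2 coincides with the inner product under an explicit nonlinear feature map phi, and classification is then a linear discrimination sign(w . phi(x) + b) in that feature space. The names phi, w, b, d and the random data are illustrative assumptions, not quantities defined in the paper.

import numpy as np

def phi(x):
    # Explicit feature map for the homogeneous degree-2 polynomial kernel:
    # phi(x) lists all products x_i * x_j, so phi(x) . phi(z) = (x . z)^2.
    return np.outer(x, x).ravel()

rng = np.random.default_rng(0)
d = 3                                  # input-space dimension (illustrative)
x, z = rng.normal(size=d), rng.normal(size=d)

# The kernel value computed two ways: directly, and via the feature map.
k_direct = np.dot(x, z) ** 2
k_feature = np.dot(phi(x), phi(z))
assert np.isclose(k_direct, k_feature)

# Linear discrimination in the feature space: a separating hyperplane (w, b)
# acts linearly on phi(x), even though the decision boundary is nonlinear in x.
w = rng.normal(size=d * d)             # hypothetical hyperplane normal
b = 0.0
label = np.sign(np.dot(w, phi(x)) + b)
print(k_direct, k_feature, label)

The sketch only illustrates the map-then-discriminate structure; the paper's contribution concerns how the learning curve of such a classifier scales with the essential dimension of the feature space near the intersection of the feature-vector subset and the true separating hyperplane.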
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); pattern classification; polynomials; set theory; vectors; asymptotic average generalization error; feature vector; high-dimensional feature space; linear dichotomy; machine learning curve; polynomial kernel classifier; subset; support vector machine
Conference_Titel :
SICE 2004 Annual Conference
Conference_Location :
Sapporo
Print_ISBN :
4-907764-22-7