Title :
A kernelized maximal-figure-of-merit learning approach based on subspace distance minimization
Author :
Byun, Byungki ; Lee, Chin-Hui
Author_Institution :
Sch. of ECE, Georgia Inst. of Technol., Atlanta, GA, USA
Abstract :
We propose a kernelized maximal-figure-of-merit (MFoM) learning approach to efficiently train a nonlinear model using subspace distance minimization. In particular, a fixed, small number of training samples is chosen so that the distance between the function space constructed from that subset and the function space constructed from the entire training data set is minimized. This construction of the subset enables us to learn a nonlinear model efficiently while keeping the resulting model nearly as good as the model trained on the whole training data set. We show that the subspace distance can be minimized through the Nyström extension. Experimental results on various machine learning problems demonstrate clear advantages of the proposed technique over the case where the function space is built from randomly selected training samples. Additional comparisons with the model trained on the entire training set show that the proposed technique achieves comparable results while reducing training time tremendously.
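The abstract's key mechanism is the Nyström extension: approximating the full kernel matrix from a small landmark subset, with the approximation gap acting as a proxy for the distance between the subset's function space and that of the full training set. Below is a minimal, illustrative sketch of that idea, not the paper's implementation: the RBF kernel, the Frobenius-norm gap, and the sampled greedy selection heuristic are all assumptions introduced only to make the example concrete.

```python
# Illustrative sketch (assumptions noted above, not the paper's method):
# the Nystroem extension approximates a full kernel matrix from a small
# landmark subset; the approximation error stands in for the subspace
# distance between the subset's function space and the full one.
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def nystroem_approximation(X, landmark_idx, gamma=0.5):
    """Approximate the full kernel matrix using only the landmark columns."""
    K_nm = rbf_kernel(X, X[landmark_idx], gamma)                 # n x m cross-kernel
    K_mm = rbf_kernel(X[landmark_idx], X[landmark_idx], gamma)   # m x m landmark kernel
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T                  # n x n Nystroem approximation

def subspace_gap(K_full, K_approx):
    """Frobenius-norm gap, used here as a proxy for the subspace distance."""
    return np.linalg.norm(K_full - K_approx, ord="fro")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))            # toy training data (assumption)
    K_full = rbf_kernel(X, X)
    m = 20                                    # fixed, small subset size

    # Baseline: randomly selected landmarks.
    random_idx = rng.choice(len(X), size=m, replace=False)
    err_random = subspace_gap(K_full, nystroem_approximation(X, random_idx))

    # Greedy heuristic (an assumption, for illustration only): repeatedly add
    # the candidate sample that most reduces the current approximation gap.
    greedy_idx, candidates = [], list(range(len(X)))
    for _ in range(m):
        best, best_err = None, np.inf
        for c in rng.choice(candidates, size=40, replace=False):
            err = subspace_gap(K_full, nystroem_approximation(X, greedy_idx + [c]))
            if err < best_err:
                best, best_err = c, err
        greedy_idx.append(best)
        candidates.remove(best)

    print(f"random subset gap: {err_random:.3f}")
    print(f"greedy subset gap: {best_err:.3f}")
```

On toy data such as this, the gap-minimizing subset typically yields a noticeably smaller kernel approximation error than a random subset of the same size, which is the intuition behind comparing against randomly selected training samples in the experiments.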
Keywords :
learning (artificial intelligence); Nyström extension; kernelized maximal-figure-of-merit learning approach; machine learning; nonlinear model; subspace distance minimization; training sample; Data models; Error analysis; Kernel; Measurement; Minimization; Training; Training data; Kernel machines; Performance metric; Subspace distance minimization; Nyström extension;
Conference_Title :
2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Prague, Czech Republic
Print_ISBN :
978-1-4577-0538-0
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2011.5946732