Title :
Feature selection for regularized least-squares: New computational short-cuts and fast algorithmic implementations
Author :
Pahikkala, Tapio; Airola, Antti; Salakoski, Tapio
Author_Institution :
Turku Centre for Computer Science, University of Turku, Turku, Finland
Date :
Aug. 29 - Sept. 1, 2010
Abstract :
We propose novel computational short-cuts for constructing sparse linear predictors with regularized least-squares (RLS), also known as the least-squares support vector machine or ridge regression. The short-cuts make it possible to accelerate the search in the power set of features with the leave-one-out criterion as a search heuristic. Our first short-cut finds the optimal search direction in the power set, where a direction means either adding a new feature to the set of selected features or removing one of the previously added features. The second short-cut updates the set of selected features and the corresponding RLS solution according to a given direction. The computational complexity of both short-cuts is O(mn), where m and n are the numbers of training examples and features, respectively. The short-cuts can be used with various feature selection strategies. As case studies, we present efficient implementations of greedy and floating forward feature selection algorithms for RLS.
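The abstract describes greedy forward selection for RLS guided by the leave-one-out (LOO) criterion. The sketch below is only an illustrative baseline of that selection loop, using the standard closed-form LOO residuals of ridge regression; it does not implement the paper's O(mn) short-cuts, and the function names (e.g. greedy_rls_forward) and the regularization parameter lam are assumptions made for this example.

    # Illustrative sketch only: naive greedy forward feature selection for
    # regularized least-squares (ridge regression) with the closed-form
    # leave-one-out error as the selection criterion. Per-step cost is much
    # higher than the O(mn) reported in the paper.
    import numpy as np

    def loo_mse(X, y, lam):
        """Closed-form LOO mean squared error of ridge regression on features X."""
        # Hat matrix H = X (X^T X + lam*I)^{-1} X^T
        G = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
        H = X @ G
        residuals = (y - H @ y) / (1.0 - np.diag(H))
        return np.mean(residuals ** 2)

    def greedy_rls_forward(X, y, k, lam=1.0):
        """Select k features, each step adding the one with lowest LOO error."""
        selected, remaining = [], list(range(X.shape[1]))
        for _ in range(k):
            best_f, best_err = None, np.inf
            for f in remaining:
                err = loo_mse(X[:, selected + [f]], y, lam)
                if err < best_err:
                    best_f, best_err = f, err
            selected.append(best_f)
            remaining.remove(best_f)
        return selected

    # Example usage on synthetic data with three informative features
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    y = X[:, [3, 7, 11]] @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.standard_normal(100)
    print(greedy_rls_forward(X, y, k=3))

A floating forward variant would additionally test removal directions after each addition; the paper's contribution is making both kinds of direction evaluation and the subsequent RLS update cheap, rather than refitting from scratch as done here.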
Keywords :
computational complexity; feature extraction; least squares approximations; regression analysis; support vector machines; computational complexities; feature selection; least-squares support vector machine; regularized least-squares; ridge regression; fitting
Conference_Title :
2010 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Conference_Location :
Kittilä, Finland
Print_ISBN :
978-1-4244-7875-0
ISSN :
1551-2541
DOI :
10.1109/MLSP.2010.5589210