Title :
Adaptive sparseness for supervised learning
Author :
Figueiredo, Mário A. T.
Author_Institution :
Dept. of Electr. & Comput. Eng., Inst. Superior Técnico, Lisboa, Portugal
Abstract :
The goal of supervised learning is to infer a functional mapping based on a set of training examples. To achieve good generalization, it is necessary to control the "complexity" of the learned function. In Bayesian approaches, this is done by adopting a prior for the parameters of the function being learned. We propose a Bayesian approach to supervised learning that leads to sparse solutions, that is, solutions in which irrelevant parameters are automatically set exactly to zero. Other ways to obtain sparse classifiers (such as Laplacian priors or support vector machines) involve (hyper)parameters which control the degree of sparseness of the resulting classifiers; these parameters have to be adjusted or estimated from the training data. In contrast, our approach involves no (hyper)parameters to be adjusted or estimated. This is achieved by a hierarchical-Bayes interpretation of the Laplacian prior, which is then modified by the adoption of a Jeffreys' noninformative hyperprior. Implementation is carried out by an expectation-maximization (EM) algorithm. Experiments with several benchmark data sets show that the proposed approach yields state-of-the-art performance. In particular, our method outperforms SVMs and performs competitively with the best alternative techniques, although it involves no tuning or adjustment of sparseness-controlling hyperparameters.
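Illustrative Sketch :
The abstract describes an EM algorithm in which a hierarchical, hyperparameter-free prior drives irrelevant coefficients exactly to zero. The sketch below is only a minimal illustration of an EM-style iteration of that general flavor, applied to ordinary linear regression with a known noise variance: each step solves a reweighted ridge problem whose per-coefficient scaling comes from the current estimates, so small coefficients are progressively shrunk to zero. The variable names (H, y, sigma2), the synthetic data, and the stopping rule are assumptions for illustration, not details taken from the paper.

import numpy as np

def adaptive_sparse_em(H, y, sigma2, n_iter=50, tol=1e-6):
    """EM-style iteration for a sparsity-inducing hierarchical prior.

    Each iteration solves a reweighted ridge problem in which the
    per-coefficient scaling is derived from the current estimates,
    so coefficients that shrink toward zero stay at zero.
    Illustrative sketch only, not the paper's exact algorithm.
    """
    n, d = H.shape
    w = np.linalg.lstsq(H, y, rcond=None)[0]   # least-squares initialization
    for _ in range(n_iter):
        U = np.diag(np.abs(w))                 # scaling by current |w_i|
        A = sigma2 * np.eye(d) + U @ H.T @ H @ U
        w_new = U @ np.linalg.solve(A, U @ (H.T @ y))
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 100, 20
    H = rng.standard_normal((n, d))
    w_true = np.zeros(d)
    w_true[:3] = [2.0, -1.5, 1.0]              # only three relevant coefficients
    y = H @ w_true + 0.1 * rng.standard_normal(n)
    w_hat = adaptive_sparse_em(H, y, sigma2=0.01)
    print(np.round(w_hat, 3))                  # most entries driven to (near) zero

Note that no sparseness-controlling hyperparameter appears in the iteration: the degree of shrinkage of each coefficient is determined by the data through the current estimates themselves, which is the behavior the abstract attributes to the Jeffreys hyperprior formulation.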
Keywords :
Bayes methods; generalisation (artificial intelligence); inference mechanisms; learning by example; maximum likelihood estimation; statistical analysis; Bayesian approaches; Laplacian prior; adaptive sparseness; benchmark data sets; expectation-maximization algorithm; experiments; functional mapping; generalization; hierarchical-Bayes interpretation; hyperparameters; noninformative hyperprior; sparse solutions; supervised learning; training examples; Automatic control; Bayesian methods; Feedforward neural networks; Kernel; Laplace equations; Neural networks; Supervised learning; Support vector machine classification; Support vector machines; Training data;
Journal_Title :
Pattern Analysis and Machine Intelligence, IEEE Transactions on
DOI :
10.1109/TPAMI.2003.1227989