Title :
Anisotropic noise injection for input variables relevance determination
Author :
Grandvalet, Yves
Author_Institution :
Univ. de Technol. de Compiegne, France
Date :
11/1/2000 12:00:00 AM
Abstract :
There are two archetypal ways to control the complexity of a flexible regressor: subset selection and ridge regression. In neural-networks jargon, they are known, respectively, as pruning and weight decay. These techniques may also be adapted to estimate which features of the input space are relevant for predicting the output variables. Relevance is given by a binary indicator for subset selection, and by a continuous rating for ridge regression. This paper shows how to achieve such a rating for a multilayer perceptron trained with noise (or jitter). Noise injection (NI) is modified so as to heavily penalize irrelevant features. The proposed algorithm is attractive in that it requires the tuning of a single parameter. This parameter controls the complexity of the model (effective number of parameters) together with the rating of feature relevances (effective input space dimension). Bounds on the effective number of parameters support the claim that the stability of this adaptive scheme is enforced by the constraints applied to the admissible set of relevance indices. The good properties of the algorithm are confirmed by satisfactory experimental results on simulated data sets.
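The core idea of the abstract (anisotropic noise: larger perturbations on less relevant inputs, so the regressor learns to ignore them) can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm; the function name `inject_anisotropic_noise`, the scaling rule `sigma / relevance`, and the parameter names are all assumptions made for the example.

```python
import numpy as np

def inject_anisotropic_noise(X, relevance, sigma=0.1, rng=None):
    """Add zero-mean Gaussian noise with per-feature std sigma / relevance.

    Features rated less relevant (relevance close to 0) receive
    proportionally larger noise, pushing the network to discard them.
    Entries of `relevance` are assumed to lie in (0, 1].
    This scaling rule is illustrative, not the paper's exact scheme.
    """
    rng = np.random.default_rng() if rng is None else rng
    relevance = np.asarray(relevance, dtype=float)
    scales = sigma / relevance  # anisotropic std, one value per input feature
    return X + rng.normal(0.0, 1.0, size=X.shape) * scales
```

In a training loop one would corrupt each batch with this function before the forward pass, while adapting `relevance` under a constraint (e.g. a fixed sum), so that the single hyperparameter `sigma` governs both the smoothing strength and the relevance rating, as the abstract describes.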
Keywords :
computational complexity; jitter; multilayer perceptrons; noise; NI; adaptive scheme; anisotropic noise injection; effective input space dimension; feature relevance rating; flexible regressor; input variables relevance determination; multilayer perceptron; neural networks; pruning; relevance indices; ridge regression; stability; subset selection; weight decay; Anisotropic magnetoresistance; Cost function; Covariance matrix; Input variables; Jitter; Multilayer perceptrons; Noise robustness; Robust control; Stability; Transfer functions;
Journal_Title :
IEEE Transactions on Neural Networks