DocumentCode :
3164136
Title :
Noise sensitivity signatures for model selection
Author :
Grossman, Tal ; Lapedes, Alan
Author_Institution :
Theoretical Div., Los Alamos Nat. Lab., NM, USA
Volume :
2
fYear :
1994
fDate :
9-13 Oct 1994
Firstpage :
213
Abstract :
Presents a method, based on scrambling the output classes of various fractions of the training data, for calculating the “noise sensitivity signature” of a learning algorithm. This signature can indicate a good (or bad) match between the complexity of the classifier and the complexity of the data, and hence can be used to improve the predictive accuracy of a classification algorithm. The use of noise sensitivity signatures is distinctly different from other schemes for avoiding overtraining, such as cross-validation, which uses only part of the training data, or various penalty functions, which are not data-adaptive. Noise sensitivity signature methods use all of the training data and are manifestly data-adaptive and nonparametric. They are well suited to situations with limited training data.
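The abstract only outlines the procedure; the following is a minimal illustrative sketch in Python of the general idea, not the authors' implementation. The function name, the choice of a scikit-learn logistic regression, and the specific noise fractions are assumptions made for the example: the labels of increasing fractions of the training set are scrambled, the classifier is refit, and its fit to the noisy labels is recorded at each noise level.

import numpy as np
from sklearn.linear_model import LogisticRegression

def noise_sensitivity_signature(X, y,
                                noise_fractions=(0.0, 0.1, 0.2, 0.3, 0.4, 0.5),
                                n_trials=5, seed=0):
    # Illustrative sketch only (hypothetical helper, not the paper's method):
    # mean training-set fit after scrambling the labels of each fraction.
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    signature = []
    for rho in noise_fractions:
        scores = []
        for _ in range(n_trials):
            y_noisy = np.array(y, copy=True)
            n_flip = int(round(rho * len(y)))
            idx = rng.choice(len(y), size=n_flip, replace=False)
            # Scramble the output classes of the chosen fraction of examples.
            y_noisy[idx] = rng.choice(classes, size=n_flip)
            clf = LogisticRegression(max_iter=1000).fit(X, y_noisy)
            scores.append(clf.score(X, y_noisy))
        signature.append(float(np.mean(scores)))
    return np.array(signature)

Under these assumptions, a classifier whose fit degrades only slowly as more labels are scrambled is flexible enough to memorize noise and is likely over-complex for the data, whereas a steep degradation suggests a closer match between classifier complexity and data complexity, which is the kind of diagnostic the signature is meant to provide.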
Keywords :
learning (artificial intelligence); classifier; complexity; learning algorithm; model selection; noise sensitivity signatures; predictive accuracy; Accuracy; Automata; Curve fitting; Feeds; Inference algorithms; Neural networks; Pattern recognition; Polynomials; Training data;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the 12th IAPR International Conference on Pattern Recognition, 1994, Vol. 2 - Conference B: Computer Vision & Image Processing
Conference_Location :
Jerusalem
Print_ISBN :
0-8186-6270-0
Type :
conf
DOI :
10.1109/ICPR.1994.576906
Filename :
576906