DocumentCode :
1749238
Title :
Lower bounds for empirical and leave-one-out estimates of the generalization error
Author :
Gavin, G.; Teytaud, O.
Author_Institution :
ERIC, Lyon Univ., Mendès France
Volume :
2
fYear :
2001
fDate :
2001
Firstpage :
1238
Abstract :
Re-sampling estimates are usually considered more efficient than the empirical error for estimating generalization performance. In this paper we consider the leave-one-out estimate and show that, in this framework, it is no better than the empirical error; moreover, the training-error estimate is sometimes more efficient. The paper summarizes the machine-learning framework, defines the sample complexity, and recalls some standard results.
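To make the two estimators concrete, here is a minimal Python sketch (not from the paper; the 1-nearest-neighbour learner and all names are hypothetical illustrations). The empirical error retests the model on its own training sample, while the leave-one-out estimate averages the errors of n models, each trained with one point held out. The sketch only shows what the two quantities compute; it does not reproduce the paper's lower bounds.

import numpy as np

def empirical_error(fit, X, y):
    # Training (resubstitution) error: fit on the full sample, test on it.
    f = fit(X, y)
    return float(np.mean(f(X) != y))

def leave_one_out_error(fit, X, y):
    # Average error of n models, each trained on all points but one
    # and tested on the single held-out point.
    n = len(y)
    errors = 0
    for i in range(n):
        mask = np.arange(n) != i
        f = fit(X[mask], y[mask])
        errors += int(f(X[i:i + 1])[0] != y[i])
    return errors / n

def fit_1nn(X, y):
    # Hypothetical learner: predict each query from its nearest training point.
    def predict(Xq):
        d = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
        return y[d.argmin(axis=1)]
    return predict

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=40) > 0).astype(int)
# For 1-NN the training error is trivially zero, while the
# leave-one-out estimate is not, which illustrates why the two
# estimators can disagree.
print("empirical (training) error:", empirical_error(fit_1nn, X, y))
print("leave-one-out estimate:", leave_one_out_error(fit_1nn, X, y))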
Keywords :
computational complexity; generalisation (artificial intelligence); learning (artificial intelligence); learning systems; probability; generalization; learning error; leave-one-out estimates; lower bounds; machine learning; sample complexity; Frequency; Probability distribution; Statistical learning; Sufficient conditions; Upper bound;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2001. Proceedings. IJCNN '01. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939538
Filename :
939538