DocumentCode
329044
Title
Estimating learning curves by PAC-learnability criterion
Author
Takahashi, Haruhisa ; Tomita, Etsuji
Author_Institution
Dept. of Commun. & Syst. Eng., Univ. of Electro-Commun., Tokyo, Japan
Volume
2
fYear
1993
fDate
25-29 Oct. 1993
Firstpage
1641
Abstract
This paper improves the sample complexity needed for reliable generalization under the PAC (probably approximately correct) learnability criterion in neural networks, from which learning curves are estimated. By taking the supremum of the error over only those candidate network realizations attained by minimizing the empirical error, we can refine the order of the sample complexity, whereas previous methods take the supremum over the whole configuration space. Dimension analysis of concept classes, which is simpler to estimate in real systems than the Vapnik-Chervonenkis (VC) dimension, is introduced in place of traditional VC dimension analysis for calculating the generalization error.
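The abstract's idea of reading a learning curve off a sample-complexity bound can be illustrated with a generic PAC-style estimate. The sketch below is only a rough illustration under assumed conventions: the function name pac_error_bound, the constants, and the classic VC-style form eps(m) ~ (d ln(2m/d) + ln(1/delta)) / m are all assumptions, not the paper's refined bound over empirical-error minimizers.

```python
import numpy as np

def pac_error_bound(m, d, delta=0.05):
    """Illustrative PAC-style upper bound on generalization error.

    Assumptions (not from the paper): a plain VC-style bound of the form
    (d * ln(2m/d) + ln(1/delta)) / m, where m is the sample size, d plays
    the role of a dimension of the concept class, and delta is the
    confidence parameter.
    """
    m = np.asarray(m, dtype=float)
    return (d * np.log(2.0 * m / d) + np.log(1.0 / delta)) / m

if __name__ == "__main__":
    # Sketch of a learning curve: bound on error versus sample size.
    sample_sizes = np.array([100, 300, 1000, 3000, 10000])
    for m, eps in zip(sample_sizes, pac_error_bound(sample_sizes, d=20)):
        print(f"m = {m:6d}  estimated error bound = {eps:.4f}")
```

Plotting pac_error_bound against m would give the monotonically decreasing learning curve that such bounds predict; the paper's contribution is to tighten the order of this bound by restricting the supremum to empirical-error minimizers.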
Keywords
error analysis; estimation theory; learning (artificial intelligence); minimisation; neural nets; PAC-learnability criterion; Vapnik-Chervonenkis dimension; configuration space; dimension analysis; error minimisation; generalization error; learning curve estimation; learning curves; neural networks; sample complexity; Content addressable storage; Error correction; Neural networks; Physics; Probability; Reliability engineering; Risk analysis; Systems engineering and theory; Telecommunication network reliability; Virtual colonoscopy;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN
0-7803-1421-2
Type
conf
DOI
10.1109/IJCNN.1993.716966
Filename
716966