Title :
Towards more practical average bounds on supervised learning
Author :
Gu, Hanzhong; Takahashi, Haruhisa
Author_Institution :
Dept. of Commun. & Syst. Eng., Univ. of Electro-Commun., Chofu, Japan
Date :
7/1/1996
Abstract :
In this paper, we describe a method that enables us to study the average generalization performance of learning directly via hypothesis-testing inequalities. The resulting theory provides a unified viewpoint on the average-case learning curves of concept learning and regression in realistic learning problems, not necessarily within the Bayesian framework. The advantages of the theory are that it alleviates the practical pessimism frequently attributed to the results of the Vapnik-Chervonenkis (VC) theory and the like, and that it provides general insights into generalization. Moreover, the bounds on learning curves are directly related to the number of adjustable system weights. Although the theory is based on an approximation assumption and does not apply to the worst-case learning setting, the precondition of the assumption is mild, and the approximation itself is only a sufficient condition for the validity of the theory. We illustrate the results with numerical simulations and apply the theory to examining the generalization ability of combinations of neural networks.
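Illustrative_Example :
A minimal numerical sketch of the abstract's claim that average-case learning curves are governed by the number of adjustable weights W relative to the sample size m. This is not the paper's method; it substitutes ordinary least-squares regression, for which the classical average-case test error behaves roughly like sigma^2 * (1 + W/m), and all constants and model choices below are illustrative assumptions.

# Sketch (assumed, not from the paper): average test error of OLS
# regression over many trials, compared with a heuristic
# sigma^2 * (1 + W/m) learning-curve approximation.
import numpy as np

rng = np.random.default_rng(0)
W = 10            # number of adjustable weights (input dimension)
sigma = 0.5       # noise standard deviation
w_true = rng.normal(size=W)
n_trials = 200

print(f"{'m':>5} {'avg test MSE':>14} {'sigma^2*(1+W/m)':>17}")
for m in [20, 40, 80, 160, 320, 640]:
    errs = []
    for _ in range(n_trials):
        # Draw a training set of size m with Gaussian noise.
        X = rng.normal(size=(m, W))
        y = X @ w_true + sigma * rng.normal(size=m)
        w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        # Fresh test set estimates the generalization error.
        Xt = rng.normal(size=(2000, W))
        yt = Xt @ w_true + sigma * rng.normal(size=2000)
        errs.append(np.mean((Xt @ w_hat - yt) ** 2))
    print(f"{m:>5} {np.mean(errs):>14.4f} {sigma**2 * (1 + W / m):>17.4f}")

Averaging over trials, the empirical test error tracks the W/m-shaped curve, consistent with the abstract's observation that average bounds tie learning curves to the number of adjustable weights.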
Keywords :
approximation theory; generalisation (artificial intelligence); interpolation; learning (artificial intelligence); neural nets; Vapnik-Chervonenkis theory; approximation; average bounds; average-case learning curves; concept learning; generalization; hypothesis testing inequality; neural networks; supervised learning; Algorithm design and analysis; Annealing; Bayesian methods; Neural networks; Numerical simulation; Predictive models; Sufficient conditions; Supervised learning; Testing
Journal_Title :
IEEE Transactions on Neural Networks