DocumentCode :
2268514
Title :
Assessing generalization of feedforward neural networks
Author :
Turmon, Michael J. ; Fine, Terrence L.
Author_Institution :
Sch. of Electr. Eng., Cornell Univ., Ithaca, NY, USA
fYear :
1995
fDate :
17-22 Sep 1995
Firstpage :
168
Abstract :
Neural networks have been used to tackle what might be termed "empirical regression" problems. Given independent samples of input/output pairs (x_i, y_i), we wish to estimate f(x) = E[Y|X=x]. The approach taken is to choose an approximating class of networks N = {η(x;w) : w ∈ W} and, within that class, by an often complex procedure, choose an approximating network η(·;w*). The distance (in mean squared error) of this network from f can be separated into two terms: one for approximation, or bias (choosing N large enough that some η(·;w0), say, models f well), and one for estimation, or variance (how well the chosen η(·;w*) performs relative to η(·;w0)). We address the latter term.
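The bias/variance separation described above can be written as a simple add-and-subtract identity (a sketch using the abstract's own symbols, where w0 denotes a best approximator within N and w* the network actually chosen):

```latex
\[
\underbrace{\mathbb{E}\big[(f(X)-\eta(X;w^{*}))^{2}\big]}_{\text{total MSE}}
\;=\;
\underbrace{\mathbb{E}\big[(f(X)-\eta(X;w_{0}))^{2}\big]}_{\text{approximation (bias)}}
\;+\;
\underbrace{\mathbb{E}\big[(f(X)-\eta(X;w^{*}))^{2}\big]
 - \mathbb{E}\big[(f(X)-\eta(X;w_{0}))^{2}\big]}_{\text{estimation (variance)}}
\]
```

The first term depends only on how rich the class N is; the second measures the excess error incurred by selecting w* from data rather than using the best member w0, and is the quantity the paper addresses.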
Keywords :
approximation theory; estimation theory; feedforward neural nets; generalisation (artificial intelligence); statistical analysis; approximating class; approximating network; bias; distance; empirical regression problems; estimation; feedforward neural networks; generalization; input/output pairs; mean squared error; variance; Estimation error; Estimation theory; Feedforward neural networks; Gaussian processes; Neural networks; Upper bound
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1995 IEEE International Symposium on Information Theory
Conference_Location :
Whistler, BC
Print_ISBN :
0-7803-2453-6
Type :
conf
DOI :
10.1109/ISIT.1995.531517
Filename :
531517