DocumentCode :
3264109
Title :
Efficient cross-validation for feedforward neural networks
Author :
Kwok, Tin-Yau ; Yeung, Dit-Yan
Author_Institution :
Dept. of Comput. Sci., Hong Kong Univ. of Sci. & Technol., Kowloon, Hong Kong
Volume :
5
fYear :
1995
fDate :
Nov/Dec 1995
Firstpage :
2789
Abstract :
Studies the use of cross-validation for estimating the prediction risk of feedforward neural networks. In particular, the problem of variability due to the choice of random initial weights for learning is addressed. The authors demonstrate that nonlinear cross-validation may not be able to prevent the network from falling into the "wrong" perturbed local minimum. A modified approach that reduces the problem to a linear one is proposed; it is more efficient and does not suffer from the local minimum problem. Simulation results for two regression problems are discussed.
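The abstract's key idea can be illustrated with a minimal sketch: if the hidden-layer weights are held fixed, refitting only the output layer per cross-validation fold is a linear least-squares problem with a unique solution, so no fold can fall into a "wrong" local minimum. The data, network size, and fold count below are hypothetical stand-ins, not the paper's actual method or simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical; not from the paper's experiments).
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(100)

# Fix the hidden weights once, so each fold only refits the linear
# output layer -- a rough stand-in for the linearization idea.
n_hidden = 10
W = rng.standard_normal((1, n_hidden))
b = rng.standard_normal(n_hidden)

def hidden_features(X):
    """Fixed random hidden layer: tanh(X @ W + b)."""
    return np.tanh(X @ W + b)

def kfold_risk(X, y, k=5):
    """k-fold cross-validation estimate of mean squared prediction risk."""
    idx = np.arange(len(X))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        H_tr = hidden_features(X[train])
        # Output weights via least squares: a linear problem,
        # so the per-fold fit has no local-minimum issue.
        w_out, *_ = np.linalg.lstsq(H_tr, y[train], rcond=None)
        errs.append(np.mean((hidden_features(X[fold]) @ w_out - y[fold]) ** 2))
    return float(np.mean(errs))

risk = kfold_risk(X, y)
print(f"estimated prediction risk (MSE): {risk:.4f}")
```

Because each fold solves a convex least-squares problem, the risk estimate is deterministic given the fixed hidden weights, removing the initial-weight variability the abstract highlights.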
Keywords :
feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); feedforward neural networks; learning; nonlinear cross-validation; perturbed local minimum; prediction risk; random initial weights; regression problems; variability; Computer science; Degradation; Density measurement; Feedforward neural networks; Function approximation; Multi-layer neural network; Neural networks; Pattern classification; Testing; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE International Conference on Neural Networks, 1995
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
Type :
conf
DOI :
10.1109/ICNN.1995.488173
Filename :
488173