Title :
Efficient cross-validation for feedforward neural networks
Author :
Kwok, Tin-Yau ; Yeung, Dit-Yan
Author_Institution :
Dept. of Comput. Sci., Hong Kong Univ. of Sci. & Technol., Kowloon, Hong Kong
Abstract :
Studies the use of cross-validation for estimating the prediction risk of feedforward neural networks. In particular, the problem of variability due to the choice of random initial weights for learning is addressed. The authors demonstrate that nonlinear cross-validation may fail to prevent the network from falling into the "wrong" perturbed local minimum. A modified approach that reduces the problem to a linear one is proposed; it is more efficient and does not suffer from the local minimum problem. Simulation results for two regression problems are discussed.
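The efficiency gain from reducing cross-validation to a linear problem can be illustrated with the classical closed-form shortcut for linear models: once a model is linear (or linearized) in its parameters, leave-one-out residuals follow directly from a single fit via the hat-matrix diagonal, so no retraining per held-out point is needed. The sketch below (in NumPy, with synthetic data; it is an illustration of the linear leave-one-out identity, not the authors' exact procedure) checks the shortcut against brute-force refitting:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])          # synthetic ground-truth weights
y = X @ beta + 0.1 * rng.normal(size=n)

# Fit the linear model once on all n points.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

# Hat-matrix diagonal: h_ii = x_i^T (X^T X)^{-1} x_i.
h = np.diag(X @ np.linalg.solve(X.T @ X, X.T))

# Closed-form leave-one-out residuals: e_i / (1 - h_ii).
loo_fast = resid / (1.0 - h)

# Brute-force leave-one-out: refit n times, once per held-out point.
loo_slow = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    c, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    loo_slow[i] = y[i] - X[i] @ c

assert np.allclose(loo_fast, loo_slow)
print("LOO prediction risk estimate:", np.mean(loo_fast ** 2))
```

The single-fit estimate matches the n-refit estimate exactly, which is the source of the efficiency claimed for the linearized approach.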
Keywords :
feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); feedforward neural networks; learning; nonlinear cross-validation; perturbed local minimum; prediction risk; random initial weights; regression problems; variability; Computer science; Degradation; Density measurement; Feedforward neural networks; Function approximation; Multi-layer neural network; Neural networks; Pattern classification; Testing; Training data
Conference_Titel :
Proceedings of the IEEE International Conference on Neural Networks, 1995
Conference_Location :
Perth, WA, Australia
Print_ISBN :
0-7803-2768-3
DOI :
10.1109/ICNN.1995.488173