Title :
Optimized Approximation Algorithm in Neural Networks Without Overfitting
Author :
Liu, Yinyin ; Starzyk, Janusz A. ; Zhu, Zhen
Author_Institution :
Sch. of Electr. Eng. & Comput. Sci., Ohio Univ., Athens, OH
Date :
6/1/2008
Abstract :
In this paper, an optimized approximation algorithm (OAA) is proposed to address the overfitting problem in function approximation using neural networks (NNs). The OAA avoids overfitting by means of a novel and effective stopping criterion based on the estimation of the signal-to-noise ratio figure (SNRF). Using the SNRF, which checks the goodness of fit of the approximation, overfitting can be detected automatically from the training error alone, without the use of a separate validation set. The algorithm has been applied to optimizing the number of hidden neurons in a multilayer perceptron (MLP) and to optimizing the number of learning epochs in the MLP's backpropagation training, using both synthetic and benchmark data sets. The OAA can also be used to optimize other NN parameters. In addition, it can be applied to function approximation using any kind of basis functions, or to learning model selection, whenever overfitting needs to be considered.
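Example :
The abstract does not give the SNRF formula, so the sketch below only illustrates the general shape of the stopping criterion: grow the number of hidden neurons and stop once an SNRF-like measure of the training residual falls below a noise-level threshold. The snrf_estimate proxy (correlation of neighbouring residuals versus the remaining residual energy), the 1/sqrt(N) threshold, and the use of scikit-learn's MLPRegressor are illustrative assumptions, not the paper's implementation.

# Minimal sketch of an SNRF-style stopping rule for growing an MLP's hidden layer.
# snrf_estimate() is a simplified stand-in, not the exact SNRF formula from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

def snrf_estimate(x, residual):
    # Sort residuals by input value, then compare the correlation between
    # neighbouring residuals (structure still left to learn) with the
    # remaining, noise-like energy. Illustrative proxy for the SNRF.
    order = np.argsort(x.ravel())
    e = residual[order]
    signal = np.dot(e[:-1], e[1:])        # correlated (unlearned) part
    noise = np.dot(e, e) - signal         # uncorrelated (noise-like) part
    return signal / noise if noise > 0 else np.inf

def grow_mlp(x, y, max_hidden=30, threshold=None):
    # Add hidden neurons one at a time; stop when the residual looks like noise.
    if threshold is None:
        threshold = 1.0 / np.sqrt(len(y))  # assumed noise-level threshold
    for hidden in range(1, max_hidden + 1):
        model = MLPRegressor(hidden_layer_sizes=(hidden,),
                             max_iter=2000, random_state=0)
        model.fit(x, y)
        residual = y - model.predict(x)
        if snrf_estimate(x, residual) < threshold:
            return model, hidden           # remaining error is noise-like: stop
    return model, max_hidden

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 200).reshape(-1, 1)
    y = np.sin(6 * np.pi * x).ravel() + 0.1 * rng.standard_normal(200)
    model, hidden = grow_mlp(x, y)
    print(f"stopped at {hidden} hidden neurons")

The same loop structure applies to the other use named in the abstract, optimizing the number of backpropagation epochs: iterate over epochs instead of hidden-layer sizes and apply the same residual-based stopping test.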
Keywords :
backpropagation; function approximation; multilayer perceptrons; optimisation; backpropagation training; goodness-of-fit; multilayer perceptron; neural network; optimized approximation algorithm; overfitting problem; signal-to-noise-ratio figure estimation; stopping criterion; Function approximation; neural network (NN) learning; overfitting; Algorithms; Computer Simulation; Neural Networks (Computer); Pattern Recognition, Automated; Signal Processing, Computer-Assisted;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2007.915114