Title :
An improved backpropagation neural network learning
Author :
Stoyanov, Ivelin Peev
Author_Institution :
Inst. for Inf. Technol., Bulgarian Acad. of Sci., Sofia, Bulgaria
Abstract :
The backpropagation neural network (BPNN) is a well-known and widely used mathematical model for pattern recognition, nonlinear function approximation, time series prediction, etc. Many applications require large input and hidden layers, and in such cases the learning process takes a long time. Many authors propose different methods to reduce the learning time by improving convergence. In the present report, a topological method is proposed to cope with this problem. Neurons whose weights tend toward constant values during learning are fixed, and their weights are no longer updated for the remainder of training. Learning stops either when the error reaches an appropriate minimum or when the learning time exceeds a preset limit. Experiments demonstrate that this method decreases the learning time by about 50%.
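Illustration (not part of the original record): the following is a minimal sketch of the general idea the abstract describes, namely freezing hidden neurons whose weights have stopped changing during backpropagation and excluding them from further updates, with training halted on either an error target or an epoch budget. It is not the authors' exact algorithm; the network size, the XOR task, the freezing threshold and patience, and all variable names are assumptions made for the example.

```python
# Hedged sketch of "freeze neurons whose weights look constant" during backprop.
# Only the frozen neurons' incoming weights are skipped here (an assumption).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR (illustrative only)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 8, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid))   # input -> hidden weights
b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out))  # hidden -> output weights
b2 = np.zeros(n_out)

lr = 0.5
err_target = 1e-3        # stop when the mean squared error is small enough...
max_epochs = 20000       # ...or when the time (epoch) budget is exceeded
freeze_tol = 1e-5        # weight-change norm below which a neuron looks "constant"
freeze_patience = 50     # consecutive small-change epochs required before freezing
still_count = np.zeros(n_hid, dtype=int)
frozen = np.zeros(n_hid, dtype=bool)     # frozen hidden neurons are not updated

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(max_epochs):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    err = np.mean((y - T) ** 2)
    if err < err_target:
        break

    # Backward pass (standard delta rule for sigmoid units)
    d_out = (y - T) * y * (1 - y)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_hid
    dW2 = h.T @ d_out

    # Skip updates for frozen hidden neurons (columns of W1, entries of b1)
    active = ~frozen
    W1[:, active] -= lr * dW1[:, active]
    b1[active] -= lr * d_hid.sum(axis=0)[active]
    W2 -= lr * dW2
    b2 -= lr * d_out.sum(axis=0)

    # Freeze hidden neurons whose incoming weights have stopped changing
    change = np.linalg.norm(lr * dW1, axis=0)
    still_count = np.where(change < freeze_tol, still_count + 1, 0)
    frozen |= still_count >= freeze_patience

print(f"stopped at epoch {epoch}, MSE {err:.5f}, frozen hidden neurons: {frozen.sum()}")
```

In this sketch the saving comes from not computing or applying updates for frozen neurons; in a larger network the column-masked update would be replaced by skipping those neurons' gradient computation entirely.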
Keywords :
backpropagation; convergence; feedforward neural nets; network topology; pattern recognition; time series; feedforward neural network; hidden layers; learning time; time series prediction; topology; Electronic mail; Error analysis; Function approximation; Information technology; Mathematical model; Neural networks; Neurons;
Conference_Titel :
Proceedings of the 13th International Conference on Pattern Recognition, 1996
Conference_Location :
Vienna
Print_ISBN :
0-8186-7282-X
DOI :
10.1109/ICPR.1996.547632