Title :
The layer-wise method and the backpropagation hybrid approach to learning a feedforward neural network
Author :
Rubanov, Nickolai S.
Author_Institution :
Dept. of Radiophys., Byelorussian State Univ., Minsk, Byelorussia
Date :
3/1/2000
Abstract :
Feedforward neural networks (FNNs) have been proposed to solve complex problems in pattern recognition, classification, and function approximation. Despite the general success of learning methods for FNNs, such as the backpropagation (BP) algorithm, second-order optimization algorithms, and layer-wise learning algorithms, several drawbacks remain to be overcome. In particular, two major drawbacks are convergence to local minima and long learning times. We propose an efficient learning method for an FNN that combines the BP strategy with layer-by-layer optimization. More precisely, we construct the layer-wise optimization method using the Taylor series expansion of the nonlinear operators describing an FNN and propose to update the weights of each layer by a BP-based Kaczmarz iterative procedure. The experimental results show that the new learning algorithm is stable, reduces the learning time, and improves generalization compared with other well-known methods.
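The abstract names two building blocks: a Taylor series linearization of each layer and a Kaczmarz iterative procedure for updating that layer's weights. The sketch below is a minimal illustration of that combination, not the paper's exact BP-based procedure: it linearizes a single sigmoid layer around its current weights and applies one Kaczmarz projection per output unit. All names (layerwise_kaczmarz_sweep, the sigmoid nonlinearity, the toy data) are assumptions made for illustration.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layerwise_kaczmarz_sweep(W, X, T, eps=1e-12):
    # One sweep over the data for a single layer y = sigmoid(W @ x).
    # A first-order Taylor expansion of sigmoid around z = W @ x turns
    # each output unit j into one linear equation in its weight row:
    #     sigmoid'(z_j) * (dW_j @ x) = t_j - sigmoid(z_j)
    # and the Kaczmarz step projects W[j] onto that hyperplane.
    W = W.copy()
    for x, t in zip(X, T):
        z = W @ x
        y = sigmoid(z)
        g = y * (1.0 - y)          # derivative of sigmoid at z
        for j in range(W.shape[0]):
            a = g[j] * x           # coefficient row of the linearized equation
            denom = a @ a
            if denom > eps:        # skip near-degenerate rows
                W[j] += (t[j] - y[j]) / denom * a
    return W

# Toy usage (hypothetical data): fit a 2-unit layer to realizable targets.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(2, 4))
X = rng.normal(size=(64, 4))
T = sigmoid(X @ W_true.T)
W = np.zeros((2, 4))
for _ in range(30):                # repeated Kaczmarz sweeps
    W = layerwise_kaczmarz_sweep(W, X, T)
print(np.abs(sigmoid(X @ W.T) - T).max())   # residual should shrink toward 0

In the paper, the per-layer right-hand sides for the hidden layers come from BP-propagated error signals; here the targets T are supplied directly, which corresponds only to the output-layer case.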
Keywords :
backpropagation; feedforward neural nets; function approximation; optimisation; pattern recognition; Taylor series expansion; backpropagation-based Kaczmarz iterative procedure; complex problems; layer-wise method; long learning time; nonlinear operators; second-order optimization algorithms; Backpropagation algorithms; Convergence; Feedforward neural networks; Function approximation; Iterative methods; Learning systems; Neural networks; Optimization methods; Pattern recognition; Taylor series
Journal_Title :
IEEE Transactions on Neural Networks