Title :
Anti-Hebbian rule for faster backpropagation learning
Author :
Abbas, Hazem M. ; Bayoumi, Mohamed M.
Author_Institution :
Dept. of Electr. Eng., Queen's Univ., Kingston, Ont., Canada
Date :
27 Jun-2 Jul 1994
Abstract :
This paper introduces an algorithm that speeds up backpropagation learning. The algorithm is based on providing lateral connections among the neurons of every hidden layer. These connections are trained with an anti-Hebbian learning rule that decorrelates the outputs of the hidden nodes. The decorrelation minimizes the redundant information transferred from one internal layer to the next and therefore enables the network to capture the statistical properties of the mapping much faster. The algorithm is applied to several benchmark problems and the results are compared with those obtained using the conventional backpropagation network.
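To illustrate the idea described in the abstract, here is a minimal NumPy sketch of a single hidden layer whose units are linked by trainable lateral weights updated with an anti-Hebbian rule. The additive lateral pass, the XOR task, the learning rates, and the network sizes are illustrative assumptions, not the paper's exact formulation.

import numpy as np

# Assumed toy setup: 2-4-1 network trained on XOR with a sigmoid nonlinearity.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))  # hidden -> output
V  = np.zeros((n_hid, n_hid))                    # lateral (anti-Hebbian) weights
eta_bp, eta_ah = 0.5, 0.01                       # backprop / anti-Hebbian rates

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

for epoch in range(5000):
    for x, t in zip(X, T):
        # Forward pass: hidden activations, then an additive lateral pass.
        h = sigmoid(W1 @ x)
        z = h + V @ h                # decorrelated hidden representation
        y = sigmoid(W2 @ z)

        # Standard backprop updates (the lateral term's contribution to the
        # gradient is ignored here to keep the sketch short).
        d_out = (y - t) * y * (1 - y)
        d_hid = (W2.T @ d_out) * h * (1 - h)
        W2 -= eta_bp * np.outer(d_out, z)
        W1 -= eta_bp * np.outer(d_hid, x)

        # Anti-Hebbian rule: weaken lateral links between co-active units,
        # pushing the hidden outputs toward zero correlation.
        dV = -eta_ah * np.outer(z, z)
        np.fill_diagonal(dV, 0.0)    # no self-connections
        V += dV

Because the lateral weights shrink whenever two hidden units fire together, correlated (redundant) components of the hidden representation are suppressed, which is the mechanism the abstract credits for the faster convergence.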
Keywords :
backpropagation; convergence; correlation methods; network topology; neural nets; anti-Hebbian learning rule; decorrelation process; faster backpropagation learning; hidden layer; lateral neuron connections; mapping; statistical properties; Acceleration; Backpropagation algorithms; Cost function; Decorrelation; Feedforward neural networks; Joining processes; Neural networks; Neurons; Statistics; Systems engineering and theory;
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374146