Title of article :
The use of random weights for the training of multilayer networks of neurons with Heaviside characteristics
Author/Authors :
Downs, T. and Gaynier, R.J.
Issue Information :
Journal with serial number, year 1995
Pages :
9
From page :
53
To page :
61
Abstract :
Artificial neural networks have, in recent years, been applied very successfully in a wide range of areas. A major reason for this success has been the existence of a training algorithm called backpropagation. This algorithm relies upon the neural units in a network having input/output characteristics that are continuously differentiable. Such units are significantly harder to implement in silicon than neural units with Heaviside (step-function) characteristics. In this paper, we show how a training algorithm similar to backpropagation can be developed for 2-layer networks of Heaviside units by treating the network weights (i.e., interconnection strengths) as random variables. This is then used as a basis for the development of a training algorithm for networks with any number of layers, drawing upon the idea of internal representations. Some examples are given to illustrate the performance of these learning algorithms.
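The random-weight idea summarised in the abstract can be illustrated with a minimal sketch (an illustration under my own assumptions, not the authors' exact algorithm). If a Heaviside unit's weight vector is drawn as w ~ N(mu, sigma^2 I), then the probability that the unit fires on input x is P[H(w.x) = 1] = Phi(mu.x / (sigma * ||x||)), where Phi is the standard normal CDF. This expectation is smooth in mu even though H itself is not differentiable, so gradient descent on mu becomes possible. The AND task, the fixed sigma, and the learning rate below are arbitrary illustrative choices.

```python
import math

def norm_cdf(z):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norm_pdf(z):
    # standard normal density
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

# Illustrative task: learn AND; the third input component is a constant bias.
data = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 0, 1), 0), ((1, 1, 1), 1)]

# mu is the mean of the Gaussian weight vector w ~ N(mu, sigma^2 I).
mu = [0.0, 0.0, 0.0]
sigma, lr = 1.0, 0.5  # arbitrary choices for this sketch

for _ in range(2000):
    for x, t in data:
        nx = math.sqrt(sum(xi * xi for xi in x))
        s = sum(m * xi for m, xi in zip(mu, x)) / (sigma * nx)
        p = norm_cdf(s)  # P[H(w.x) = 1], smooth in mu
        # gradient step on the squared error (p - t)^2 with respect to mu
        for i in range(len(mu)):
            mu[i] -= lr * 2.0 * (p - t) * norm_pdf(s) * x[i] / (sigma * nx)

# Classify with the trained mean weights (deterministic Heaviside unit).
preds = [1 if sum(m * xi for m, xi in zip(mu, x)) > 0 else 0 for x, _ in data]
print(preds)
```

The key point is that the smoothing comes from the weight distribution, not from changing the unit's step characteristic, so the deployed network can still use hard Heaviside units with the learned mean weights.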
Keywords :
Network training , Heaviside activation functions , Random weights
Journal title :
Mathematical and Computer Modelling
Serial Year :
1995
Record number :
1590172