DocumentCode :
3630830
Title :
Analysis of Single Perceptrons Learning Capabilities
Author :
Stefen Hui;Stanislaw H. Zak
Author_Institution :
Department of Mathematical Sciences, San Diego State University, San Diego, CA 92182
fYear :
1991
fDate :
6/1/1991 12:00:00 AM
Firstpage :
809
Lastpage :
814
Abstract :
This paper addresses the problem of supervised learning in two types of artificial neurons: (i) an ADALINE (Adaptive Linear Element) with a differentiable activation function (a McCulloch-Pitts type neuron), and (ii) an ADALINE feeding a discrete dynamical system. Supervised learning occurs when the neuron is supplied with both the input and the correct output values. A learning algorithm then adjusts the adaptable parameters, the weights, based on the error of the computed output. We propose learning laws for both types of neurons, based on the Widrow-Hoff learning algorithm. We then give sufficient conditions under which the learning parameters converge, i.e., learning takes place. We also investigate conditions under which the learning parameters diverge.
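As background for readers unfamiliar with the Widrow-Hoff rule the abstract refers to, the following is a minimal sketch of its weight update (also known as the LMS rule): the weights move against the gradient of the squared error of the linear output. The function name, step size, and target weights here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def widrow_hoff_step(w, x, d, eta=0.1):
    """One Widrow-Hoff (LMS) update: w <- w + eta * (d - w.x) * x."""
    error = d - np.dot(w, x)   # error of the computed (linear) output
    return w + eta * error * x

# Illustrative run: learn to mimic a fixed linear target (assumed example).
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])   # hypothetical target weights
w = np.zeros(2)
for _ in range(200):
    x = rng.standard_normal(2)
    w = widrow_hoff_step(w, x, float(np.dot(w_true, x)))
# w now approximates w_true; with noise-free targets the error shrinks
# toward zero provided the step size eta is small enough.
```

With noise-free targets and a sufficiently small step size, the weight error contracts at each step; the paper's contribution concerns precisely such convergence (and divergence) conditions for its two neuron types.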
Keywords :
"Neurons","Neural networks","Supervised learning","Convergence","Transfer functions","Adaptive algorithm","Pattern analysis"
Publisher :
ieee
Conference_Titel :
American Control Conference, 1991
Print_ISBN :
0-87942-565-2
Type :
conf
Filename :
4791485