DocumentCode :
1264410
Title :
Performance surfaces of a single-layer perceptron
Author :
Shynk, John J.
Author_Institution :
Dept. of Electr. & Comput. Eng., California Univ., Santa Barbara, CA, USA
Volume :
1
Issue :
3
fYear :
1990
fDate :
9/1/1990
Firstpage :
268
Lastpage :
274
Abstract :
A perceptron learning algorithm may be viewed as a steepest-descent method whereby an instantaneous performance function is iteratively minimized. An appropriate performance function for the most widely used perceptron algorithm is described, and it is shown that the update term of the algorithm is the gradient of this function. An example of the corresponding performance surface based on Gaussian assumptions is given, and it is shown that there is an infinity of stationary points. The performance surfaces of two related performance functions are also examined. Computer simulations demonstrating the convergence properties of the adaptive algorithms are given.
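The steepest-descent view summarized in the abstract can be illustrated with a minimal sketch. The code below (a hypothetical illustration, not the paper's simulations; data, step size, and cost are assumed for demonstration) applies the standard single-layer perceptron update, which can be read as a steepest-descent step on an instantaneous cost whose gradient reproduces the update term:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearly separable 2-D data with labels in {-1, +1}
# (illustrative only; not the Gaussian scenario analyzed in the paper).
X = rng.normal(size=(200, 2))
true_w = np.array([1.0, -2.0])
d = np.sign(X @ true_w)

w = np.zeros(2)   # adaptive weight vector
mu = 0.1          # step size

for _ in range(20):            # a few passes over the data
    for x, target in zip(X, d):
        s = w @ x              # linear combiner output
        y = np.sign(s)         # hard-limiting quantizer
        # Perceptron update, interpretable as a steepest-descent step
        # on the instantaneous cost J(w) = (|s| - target*s)/2, whose
        # gradient (where defined) is -(target - y)*x/2.
        w = w + mu * (target - y) * x / 2

errors = int(np.sum(np.sign(X @ w) != d))
```

For separable data the updates vanish once every sample is classified correctly, so `w` settles at one of the many stationary points of the performance surface; the number of residual `errors` drops well below chance as training proceeds.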
Keywords :
convergence of numerical methods; iterative methods; learning systems; neural nets; Gaussian assumptions; adaptive algorithms; computer simulations; convergence properties; function gradient; instantaneous performance function; iterative minimization; perceptron learning algorithm; performance surfaces; single-layer perceptron; stationary points; steepest-descent method; update term; Convergence; H infinity control; Helium; Iterative algorithms; Least squares approximation; Multilayer perceptrons; Neural networks; Neurons; Quantization; Signal generators;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.80252
Filename :
80252