Title :
Performance surfaces of a single-layer perceptron
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of California, Santa Barbara, CA, USA
Date :
1 September 1990
Abstract :
A perceptron learning algorithm may be viewed as a steepest-descent method whereby an instantaneous performance function is iteratively minimized. An appropriate performance function for the most widely used perceptron algorithm is described, and it is shown that the update term of the algorithm is the gradient of this function. An example of the corresponding performance surface, based on Gaussian assumptions, is given, and it is shown that there is an infinity of stationary points. The performance surfaces of two related performance functions are also examined. Computer simulations that demonstrate the convergence properties of the adaptive algorithms are given.
Keywords :
convergence of numerical methods; iterative methods; learning systems; neural nets; Gaussian assumptions; adaptive algorithms; computer simulations; convergence properties; function gradient; instantaneous performance function; iterative minimization; perceptron learning algorithm; performance surfaces; single-layer perceptron; stationary points; steepest-descent method; update term; Convergence; Iterative algorithms; Least squares approximation; Multilayer perceptrons; Neural networks; Neurons; Quantization; Signal generators
Journal_Title :
IEEE Transactions on Neural Networks