DocumentCode :
1134090
Title :
A new approach to perceptron training
Author :
Eitzinger, Christian ; Plach, Hartwig
Author_Institution :
Protactor Res., Steyr, Austria
Volume :
14
Issue :
1
fYear :
2003
fDate :
1 January 2003
Firstpage :
216
Lastpage :
221
Abstract :
The training of perceptrons is discussed in the framework of nonsmooth optimization. An analysis of Rosenblatt's perceptron training rule shows that its convergence, and its failure to converge in certain situations, can be readily understood in this framework. An algorithm based on results from nonsmooth optimization is proposed, and its relation to the "constrained steepest descent" method is investigated. Numerical experiments confirm that the "constrained steepest descent" algorithm can be further improved by integrating methods from nonsmooth optimization.
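As background for the abstract, the sketch below shows Rosenblatt's classical perceptron training rule, which the paper analyzes: on each misclassified sample, the weight vector is updated by adding the sample scaled by its label. This is a minimal illustrative implementation, not the nonsmooth-optimization algorithm proposed in the article; the function name and toy data are assumptions.

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Rosenblatt's perceptron rule for labels in {-1, +1}.

    The bias is folded in as a constant extra input. On each sample
    misclassified by the current weights, add y_i * x_i to the weights.
    For linearly separable data this converges in finitely many updates.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:  # misclassified or on the boundary
                w += yi * xi             # Rosenblatt update
                updated = True
        if not updated:                  # all samples correctly separated
            break
    return w

# Linearly separable toy data: positive class where x1 + x2 > 1 (assumed).
X = np.array([[0., 0.], [0., 2.], [2., 0.], [2., 2.]])
y = np.array([-1., 1., 1., 1.])
w = perceptron_train(X, y)
```

After training, `y * (X @ w[:2] + w[2]) > 0` holds for every sample. The paper's point is that this rule can be viewed as a step of a nonsmooth (subgradient-style) descent method, which explains both its convergence on separable data and its failure modes on nonseparable data.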
Keywords :
convergence; gradient methods; learning (artificial intelligence); optimisation; perceptrons; constrained steepest descent algorithm; nonsmooth optimization; perceptron training rule; Automatic control; Automation; Linear programming; Neural networks; Optimization methods; Testing
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.806631
Filename :
1176141