DocumentCode :
3373286
Title :
Directed random search for multiple layer perceptron training
Author :
Seiffert, Udo ; Michaelis, Bernd
Author_Institution :
Inst. for Electron. Signal Process. & Commun., Univ. of Magdeburg, Germany
fYear :
2001
fDate :
2001
Firstpage :
193
Lastpage :
202
Abstract :
Although backpropagation (BP) is commonly used to train multiple layer perceptron (MLP) neural networks, and although its original algorithm has been significantly improved several times, it still suffers from drawbacks: it is slow, it can get stuck in local minima, and it imposes constraints on the neurons' activation (transfer) function. The paper presents the substitution of backpropagation with a random search technique enhanced by a directed component. Using several benchmark problems, a case study shows potential application fields as well as the advantages and disadvantages of both backpropagation and the directed random search (DRS).
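The abstract describes replacing gradient-based BP with a random weight search biased by a directed component. A minimal sketch of one common directed random search variant (a Matyas-style scheme: random perturbations plus an accumulated bias that is reinforced on success and decayed on failure) is shown below; the network size (2-2-1), step sizes, and update constants are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_weights():
    # Tiny 2-2-1 MLP (assumed architecture, for illustration only);
    # each matrix includes a bias row.
    return [rng.normal(0.0, 0.5, (3, 2)),   # input (+bias) -> hidden
            rng.normal(0.0, 0.5, (3, 1))]   # hidden (+bias) -> output

def forward(ws, x):
    # Note DRS places no constraint on the activation; tanh is one choice.
    h = np.tanh(np.hstack([x, np.ones((len(x), 1))]) @ ws[0])
    return np.tanh(np.hstack([h, np.ones((len(h), 1))]) @ ws[1])

def mse(ws, x, t):
    return float(np.mean((forward(ws, x) - t) ** 2))

def drs_train(ws, x, t, steps=5000, sigma=0.3, beta=0.5):
    """Directed random search: try ws + direction + noise; keep the trial
    only if the error drops, and fold the successful noise into the
    directed component. Error is non-increasing by construction."""
    best = mse(ws, x, t)
    direction = [np.zeros_like(w) for w in ws]   # the "directed" bias
    for _ in range(steps):
        noise = [rng.normal(0.0, sigma, w.shape) for w in ws]
        trial = [w + d + n for w, d, n in zip(ws, direction, noise)]
        e = mse(trial, x, t)
        if e < best:
            # Accept the step and reinforce its direction.
            direction = [beta * (d + n) for d, n in zip(direction, noise)]
            ws, best = trial, e
        else:
            # Reject the step and decay the directed component.
            direction = [beta * d for d in direction]
    return ws, best
```

Unlike BP, each iteration needs only forward passes (no gradients), which is what frees the method from constraints on the transfer function, at the cost of many more error evaluations.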
Keywords :
backpropagation; multilayer perceptrons; search problems; DRS; MLP neural networks; activation function; backpropagation; case study; directed component; directed random search; local minima; multiple layer perceptron neural networks; multiple layer perceptron training; potential application fields; random search technique; Artificial neural networks; Backpropagation algorithms; Biological system modeling; Biology computing; Computational modeling; Error correction; Neural networks; Neurons; Root mean square; Signal processing algorithms;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Neural Networks for Signal Processing XI, 2001. Proceedings of the 2001 IEEE Signal Processing Society Workshop
Conference_Location :
North Falmouth, MA
ISSN :
1089-3555
Print_ISBN :
0-7803-7196-8
Type :
conf
DOI :
10.1109/NNSP.2001.943124
Filename :
943124