DocumentCode
351010
Title
Neural networks with periodic and monotonic activation functions: a comparative study in classification problems
Author
Sopena, Josep M. ; Romero, Enrique ; Alquézar, René
Author_Institution
Lab. Neurocomput., Barcelona Univ., Spain
Volume
1
fYear
1999
fDate
1999
Firstpage
323
Abstract
This article discusses a number of reasons why the use of nonmonotonic functions as activation functions can lead to a marked improvement in the performance of a neural network. Using a wide range of benchmarks, we show that a multilayer feedforward network using sine activation functions (and an appropriate choice of initial parameters) learns much faster than one incorporating sigmoid functions: as much as 150-500 times faster when both types are trained with backpropagation. Learning speed also compares favorably with speeds reported for modified versions of the backpropagation algorithm. In addition, computational and generalization capacity also increase.
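As a rough illustration of the comparison described in the abstract, the sketch below trains the same one-hidden-layer feedforward network twice by plain batch backpropagation, once with sine hidden units and once with sigmoid hidden units. The XOR task, the small-weight initialization, and all hyperparameters are illustrative assumptions and are not taken from the paper, which only notes that sine units require an appropriate choice of initial parameters.

# Minimal sketch (not the authors' code): one-hidden-layer network trained
# by plain batch backpropagation, comparing sine vs. sigmoid hidden units
# on the XOR problem. Task, initialization, and hyperparameters are assumptions.
import numpy as np

def train_mlp(activation, activation_deriv, X, y, hidden=4,
              lr=0.5, epochs=2000, init_scale=0.1, seed=0):
    """Batch backpropagation for a 2-layer network with a sigmoid output
    unit and the given hidden activation function."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, init_scale, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, init_scale, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass
        h_in = X @ W1 + b1
        h = activation(h_in)
        out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output unit
        # backward pass (squared-error loss)
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * activation_deriv(h_in)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    return out

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
sine_out = train_mlp(np.sin, np.cos, X, y)
sigm_out = train_mlp(sigmoid, lambda z: sigmoid(z) * (1.0 - sigmoid(z)), X, y)
print("sine hidden units:   ", sine_out.ravel().round(3))
print("sigmoid hidden units:", sigm_out.ravel().round(3))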
Keywords
feedforward neural nets; feedforward neural network; generalization; learning algorithm; monotonic activation functions; neural network; pattern classification; periodic activation functions; sine activation functions;
fLanguage
English
Publisher
IET
Conference_Title
Artificial Neural Networks, 1999. ICANN 99. Ninth International Conference on (Conf. Publ. No. 470)
Conference_Location
Edinburgh
ISSN
0537-9989
Print_ISBN
0-85296-721-7
Type
conf
DOI
10.1049/cp:19991129
Filename
819741