Title of article :
Performance Analysis of Various Activation Functions in Generalized MLP Architectures of Neural Networks
Author/Authors :
Bekir Karlik, A. Vehbi Olgac
Issue Information :
Journal issue, serial year 2010
Pages :
12
From page :
111
To page :
122
Abstract :
The activation function transforms the activation level of a unit (neuron) into an output signal. A number of activation functions are in common use with artificial neural networks (ANN). The most common choice of activation function for the multi-layered perceptron (MLP) is used as the transfer function in research and engineering. Among the reasons for this popularity are its boundedness in the unit interval, the fast computability of the function and its derivative, and a number of amenable mathematical properties in the realm of approximation theory. However, considering the huge variety of problem domains to which the MLP is applied, it is intriguing to suspect that specific problems call for a single specific activation function or a set of them. The aim of this study is to analyze the performance of generalized MLP architectures trained with the back-propagation algorithm using various activation functions for the neurons of the hidden and output layers. For experimental comparisons, the bi-polar sigmoid, uni-polar sigmoid, Tanh, Conic Section, and Radial Basis Function (RBF) were used.
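For reference, three of the activation functions the abstract names have standard closed forms. The sketch below is illustrative only, not the authors' code; the function names and the choice of the Gaussian form for the RBF are assumptions, since the paper's exact parameterizations are not given in this record.

```python
import math

def unipolar_sigmoid(x):
    # Uni-polar (logistic) sigmoid: maps x to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    # Bi-polar sigmoid: maps x to (-1, 1); algebraically equal to tanh(x / 2)
    return (1.0 - math.exp(-x)) / (1.0 + math.exp(-x))

def gaussian_rbf(x, center=0.0, width=1.0):
    # One common radial basis function form (assumed here): a Gaussian
    # bump centered at `center`, peaking at 1.0
    return math.exp(-((x - center) ** 2) / (2.0 * width ** 2))
```

Tanh is available directly as `math.tanh`; the bi-polar sigmoid above coincides with `tanh(x / 2)`.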
Keywords :
Neural networks , Performance analysis , Activation functions , Multi Layered Perceptron
Journal title :
International Journal of Artificial Intelligence and Expert Systems
Serial Year :
2010
Record number :
668747