Title of article :
RENYI'S ENTROPY AND SHANNON ENTROPY FOR MULTILAYER BACKPROPAGATION NEURAL NETWORKS: A COMPARATIVE STUDY
Author/Authors :
Rady, H. A. El-Shorouk Academy - Higher Institute for Computer Information Technology, Egypt
From page :
235
To page :
265
Abstract :
Entropy-based criteria in adaptive systems have been an active area of research in recent years. Several principles have been proposed based on the maximization or minimization of entropic cost functions. One use of entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically one is the output of the learning system and the other is the target. In this paper, a comparison between Renyi's entropy and Shannon entropy for multilayer backpropagation neural networks is proposed. The usual mean square error (MSE) minimization principle is substituted by the minimization of Renyi's entropy and Shannon entropy of the differences between the multilayer perceptron's output and the desired target. These two cost functions are studied, analyzed, and tested with three different activation functions, namely the trigonometric (sine) function, the hyperbolic tangent function, and the logistic activation function, with different learning rates. The analytical approach indicates that the results are encouraging and promising: the Shannon entropy cost function converges faster than the Renyi's entropy cost function, but the degree of convergence using Renyi's entropy is higher (better) than using Shannon entropy.
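The abstract's core idea is to replace the MSE cost with an entropy estimate of the error signal (network output minus target). The paper's own estimator is not given in this record; the following is a minimal, hypothetical sketch using simple histogram-based estimates of Shannon entropy and quadratic (order-2) Renyi entropy of an error sample, only to illustrate the two cost functions being compared:

```python
import numpy as np

def shannon_entropy(errors, bins=30):
    # Histogram estimate of Shannon entropy: H = -sum_i p_i * log(p_i)
    counts, _ = np.histogram(errors, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def renyi_entropy(errors, alpha=2.0, bins=30):
    # Histogram estimate of Renyi entropy of order alpha:
    # H_alpha = (1 / (1 - alpha)) * log(sum_i p_i ** alpha)
    counts, _ = np.histogram(errors, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Example: entropy of a synthetic error sample (output minus target).
rng = np.random.default_rng(0)
errors = rng.normal(0.0, 0.5, size=1000)
h_shannon = shannon_entropy(errors)
h_renyi = renyi_entropy(errors, alpha=2.0)
print(h_shannon, h_renyi)
```

In entropy-based training, either quantity is minimized over the network weights in place of the MSE; for discrete distributions the order-2 Renyi entropy never exceeds the Shannon entropy of the same sample. Kernel (Parzen-window) estimators, rather than histograms, are also common in this literature because they are differentiable in the errors.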
Keywords :
Entropy, Renyi's Entropy, Shannon Entropy, Activation Function, Learning Rate, Backpropagation Neural Network
Journal title :
International Journal of Intelligent Computing and Information Sciences
Record number :
2565487