Title of article
Derivation of an amplitude of information in the setting of a new family of fractional entropies
Author/Authors
Guy Jumarie
Issue Information
Journal issue, serial year 2012
Pages
25
From page
113
To page
137
Abstract
By generalizing the basic functional equation f(xy) = f(x) + f(y) in the form f^β(xy) = f^β(x) + f^β(y), β > 1, one can derive a family of solutions which are exactly the inverse of the Mittag–Leffler function, referred to as the Mittag–Leffler logarithm, or logarithm of fractional order. This result provides a new family of generalized informational entropies indexed by a parameter clearly related to fractals, via fractional calculus, and quite relevant in the presence of a defect of observation. The relation with Shannon's entropy, Rényi's entropy and Tsallis' entropy is clarified, and it is shown that Tsallis' generalized logarithm has a significance in terms of fractional calculus. The case β = 2 appears directly relevant to the amplitude of probability in quantum mechanics, and provides an approach to the definition of an "amplitude of informational entropy". The kind of result obtained by applying the maximum entropy principle is examined. In the presence of an uncertain (or fuzzy) definition, the Mittag–Leffler function would be more relevant than the Gaussian normal law. To some extent, this new formulation could be supported by the derivation of a new family of fractional Fisher information.
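The central object of the abstract is the inverse of the Mittag–Leffler function, used as a logarithm of fractional order. The following is a minimal numerical sketch, not taken from the paper, that sums the one-parameter Mittag–Leffler series E_β(z) = Σ z^k / Γ(βk + 1) and inverts it by bisection; the names ml_exp and ml_log are illustrative only, and the inversion is restricted to the non-negative axis for simplicity.

```python
import math

def ml_exp(z, beta, tol=1e-15, max_terms=400):
    """One-parameter Mittag-Leffler function E_beta(z) for z >= 0,
    summed term by term in log space to avoid overflow."""
    if z == 0.0:
        return 1.0
    s = 0.0
    for k in range(max_terms):
        # term k is z**k / Gamma(beta*k + 1)
        term = math.exp(k * math.log(z) - math.lgamma(beta * k + 1))
        s += term
        if term < tol * s:
            break
    return s

def ml_log(y, beta, tol=1e-12):
    """Numerical inverse of E_beta on [1, inf): an illustrative
    'Mittag-Leffler logarithm', obtained here by simple bisection."""
    if y < 1.0:
        raise ValueError("this sketch only inverts E_beta on [1, inf)")
    lo, hi = 0.0, 1.0
    while ml_exp(hi, beta) < y:   # grow the bracket until it contains the root
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ml_exp(mid, beta) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    # beta = 1 recovers the ordinary logarithm, since E_1(z) = exp(z)
    print(ml_log(math.exp(1.5), 1.0))     # ~ 1.5
    # beta = 2, the case the abstract links to amplitudes of probability
    x = 3.0
    print(ml_log(ml_exp(x, 2.0), 2.0))    # ~ 3.0
```

For β = 1 the routine reduces to the natural logarithm, which is the sanity check used above; the paper's own construction of the fractional logarithm and the associated entropies is developed in the full text.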
Keywords
Fisher information , Shannon entropy , Informational entropy , Generalized entropy , Mittag–Leffler function , Fractional calculus
Journal title
Information Sciences
Serial Year
2012
Record number
1215240
Link To Document