DocumentCode :
3296472
Title :
On Kolmogorov-Nagumo averages and nonextensive entropy
Author :
Dukkipati, Ambedkar
Author_Institution :
Dept. of Comput. Sci. & Autom., Indian Inst. of Sci., Bangalore, India
fYear :
2010
fDate :
17-20 Oct. 2010
Firstpage :
446
Lastpage :
451
Abstract :
By replacing the linear averaging in Shannon entropy with the Kolmogorov-Nagumo average (KN-average), or quasilinear mean, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using Rényi's recipe, one can obtain only two information measures: Shannon entropy and Rényi entropy. Indeed, using this formalism Rényi characterized these additive entropies in terms of the axioms of the quasilinear mean. Just as additivity is a characteristic property of Shannon entropy, pseudo-additivity of the form x ⊕_q y = x + y + (1 - q)xy is a characteristic property of nonextensive (or Tsallis) entropy. One can apply Rényi's recipe in the nonextensive case by replacing the linear averaging in Tsallis entropy with the KN-average and imposing the constraint of pseudo-additivity. In this paper we show that nonextensive entropy is unique under Rényi's recipe, and thereby give a characterization.
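The following minimal numerical sketch is not part of the record; it assumes only the standard definitions of the Rényi and Tsallis entropies, and the function names are illustrative. It illustrates the two ingredients the abstract refers to: the KN-average of the self-information under an exponential psi reproduces Rényi entropy, and Tsallis entropy satisfies the stated pseudo-additivity for independent distributions.

import numpy as np

def kn_average(values, weights, psi, psi_inv):
    """Kolmogorov-Nagumo (quasilinear) mean: psi^{-1}( sum_i w_i * psi(x_i) )."""
    return psi_inv(np.sum(weights * psi(values)))

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = ln(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis (nonextensive) entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1), q != 1."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

alpha = 0.7
p = np.array([0.2, 0.3, 0.5])

# Renyi's recipe: the KN-average of the self-information -ln p_i under
# psi(x) = exp((1 - alpha) x) equals the Renyi entropy of order alpha.
self_info = -np.log(p)
psi = lambda x: np.exp((1.0 - alpha) * x)
psi_inv = lambda y: np.log(y) / (1.0 - alpha)
print(kn_average(self_info, p, psi, psi_inv), renyi_entropy(p, alpha))

# Pseudo-additivity of Tsallis entropy for independent P and R:
# S_q(P x R) = S_q(P) + S_q(R) + (1 - q) S_q(P) S_q(R).
q = alpha
r = np.array([0.6, 0.4])
joint = np.outer(p, r).ravel()
lhs = tsallis_entropy(joint, q)
rhs = tsallis_entropy(p, q) + tsallis_entropy(r, q) \
      + (1.0 - q) * tsallis_entropy(p, q) * tsallis_entropy(r, q)
print(lhs, rhs)  # the two values agree numerically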
Keywords :
entropy; KN-average; Kolmogorov-Nagumo averages; Rényi entropy; Shannon entropy; Tsallis entropy; additive entropy; additivity constraint; characteristic property; formal generalization; information measures; linear averaging; nonextensive entropy; pseudo-additivity; quasilinear mean; Additives; Context; Entropy; Equations; Information theory; Probability distribution; Random variables;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Information Theory and its Applications (ISITA), 2010 International Symposium on
Conference_Location :
Taichung
Print_ISBN :
978-1-4244-6016-8
Electronic_ISBN :
978-1-4244-6017-5
Type :
conf
DOI :
10.1109/ISITA.2010.5649354
Filename :
5649354