Title of article :
A new entropy upper bound
Author/Authors :
Ţăpuş, N. and Popescu, P.G.
Issue Information :
Journal with serial issue number, year 2012
Abstract :
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory (see Ash (1965) [8] and Cover and Thomas (2006) [9]). Our purpose in this work is to present a strong upper bound for the classical Shannon entropy, refining recent results from the literature. To this end, we consider the work of Simic (2009) [4], where new entropy bounds based on a new refinement of Jensen’s inequality are presented. Our work improves the basic result of Simic through a stronger refinement of Jensen’s inequality, which we then apply to information theory.
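For context only (the paper’s refined bound itself is not reproduced here), the classical Jensen-based bound that such refinements sharpen can be sketched as follows, for a discrete distribution p_1, …, p_n:

H(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log p_i
  = \sum_{i=1}^{n} p_i \log \frac{1}{p_i}
  \le \log \sum_{i=1}^{n} p_i \cdot \frac{1}{p_i}
  = \log n,

where the inequality is Jensen’s inequality applied to the concave logarithm. Refinements of Jensen’s inequality, such as those studied by Simic and strengthened in this work, tighten the gap between H and this right-hand side.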
Keywords :
Jensen’s inequality, Entropy, Bounds, Refinements, Generalizations
Journal title :
Applied Mathematics Letters