Title of article :
Some upper bounds for relative entropy and applications
Author/Authors :
S. S. Dragomir, M. L. Scholz, J. Sunde
Issue Information :
Biweekly, serial year 2000
Pages :
10
From page :
91
To page :
100
Abstract :
In this paper, we derive some upper bounds for the relative entropy D(p‖q) of two probability distributions and apply them to mutual information and the entropy mapping. To achieve this, we use an inequality for the logarithm function, (2.3) below, and some classical inequalities such as the Kantorovich inequality and the Diaz-Metcalf inequality.
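For reference, the quantities named in the abstract have the standard definitions sketched below in LaTeX; the discrete setting p = (p_1, ..., p_n), q = (q_1, ..., q_n) is an assumption for illustration, not something fixed by the record itself.
% Relative entropy (Kullback-Leibler divergence) of two discrete
% probability distributions p and q, with the convention 0 log 0 = 0:
\[
  D(p \,\|\, q) \;=\; \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i},
  \qquad D(p \,\|\, q) \ge 0 \text{ with equality iff } p = q .
\]
% Mutual information as a special case: the relative entropy between
% the joint distribution and the product of the marginals.
\[
  I(X;Y) \;=\; D\bigl(p_{XY} \,\|\, p_X \otimes p_Y\bigr).
\]
Upper bounds on D(p‖q) of the kind derived in the paper therefore translate directly into upper bounds on I(X;Y).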
Keywords :
Relative entropy, Diaz-Metcalf inequality, Kantorovich inequality, Log-mapping, Mutual information
Journal title :
Computers and Mathematics with Applications
Serial Year :
2000
Record number :
918703