DocumentCode
909920
Title
The amount of information that y gives about X
Author
Blachman, Nelson M.
Volume
14
Issue
1
fYear
1968
fDate
1/1/1968 12:00:00 AM
Firstpage
27
Lastpage
31
Abstract
No single measure of the amount of information that a specific value y of a random variable Y gives about another random variable X has all of the desirable properties possessed by Shannon's measure of the average mutual information of X and Y. It is shown that one of these properties (additivity) determines one particular form for such a measure, while others (non-negativity or coordinate independence) determine a different form. The latter, which is the more useful and accepted information measure, is thus seen to be unique.
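The two candidate measures contrasted in the abstract can be illustrated numerically. The sketch below (not from the paper; the joint distribution and function names are hypothetical, chosen only for illustration) computes, for each observed value y, a non-negative form — the divergence of p(x|y) from p(x) — and an additive entropy-drop form H(X) - H(X|Y=y), and checks that both average over y to Shannon's mutual information I(X;Y):

```python
import math

# Hypothetical toy joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y).
p_x = {x: sum(p for (xx, y), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (x, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

def p_x_given_y(x, y):
    """Conditional distribution p(x | y)."""
    return p_xy[(x, y)] / p_y[y]

def info_divergence(y):
    """Non-negative form: divergence of p(x|y) from p(x), in bits."""
    return sum(p_x_given_y(x, y) * math.log2(p_x_given_y(x, y) / p_x[x])
               for x in (0, 1))

def info_entropy_drop(y):
    """Entropy-drop form H(X) - H(X | Y=y); can be negative for some y."""
    h_x = -sum(p * math.log2(p) for p in p_x.values())
    h_x_given_y = -sum(p_x_given_y(x, y) * math.log2(p_x_given_y(x, y))
                       for x in (0, 1))
    return h_x - h_x_given_y

# Shannon's average mutual information I(X; Y) in bits.
mi = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())

# Both specific-information forms average (over y) to I(X; Y).
avg_div = sum(p_y[y] * info_divergence(y) for y in (0, 1))
avg_drop = sum(p_y[y] * info_entropy_drop(y) for y in (0, 1))
```

For this symmetric example the two forms coincide for each y; they differ in general, which is the distinction the paper's uniqueness argument turns on.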
Keywords
Information theory; Random variables; Coordinate measuring machines; Entropy; Helium; Marine vehicles; Mutual information; Particle measurements; Q measurement
fLanguage
English
Journal_Title
Information Theory, IEEE Transactions on
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.1968.1054094
Filename
1054094
Link To Document