No single measure of the amount of information that a specific value y of a random variable Y gives about another random variable X has all of the desirable properties possessed by Shannon's measure of the average mutual information of X and Y. It is shown that one of these properties (additivity) determines one particular form for I(X; y), while others (non-negativity or coordinate independence) determine a different form. The latter, which is the more useful and accepted information measure, is thus seen to be unique.