Title :
A proof of the Fisher information inequality via a data processing argument
Author_Institution :
Dept. of Electr. Eng. Syst., Tel Aviv Univ., Israel
Date :
5/1/1998
Abstract :
The Fisher information J(X) of a random variable X under a translation parameter appears in information theory in the classical proof of the entropy-power inequality (EPI). It enters the proof of the EPI via the de Bruijn identity, where it measures the variation of the differential entropy under a Gaussian perturbation, and via the convolution inequality J(X+Y)⁻¹ ⩾ J(X)⁻¹ + J(Y)⁻¹ (for independent X and Y), known as the Fisher information inequality (FII). In the literature, the FII is proved directly, in a rather involved way. We give an alternative derivation of the FII as a simple consequence of a “data processing inequality” for the Cramér-Rao lower bound on parameter estimation.
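As a numerical illustration of the FII (a sketch, not part of the paper), the code below estimates the translation-parameter Fisher information J = ∫ (p′(x))²/p(x) dx on a grid and checks J(X+Y)⁻¹ ⩾ J(X)⁻¹ + J(Y)⁻¹ for two independent Laplace variables, for which J = 1/b² in closed form for scale b. The grid width, the Laplace example, and the finite-difference score are assumptions of this sketch.

```python
# A minimal numerical check of the Fisher information inequality (FII),
#   J(X+Y)^{-1} >= J(X)^{-1} + J(Y)^{-1},
# for independent X and Y. Illustrative sketch only: the grid, the Laplace
# example, and the finite-difference score are assumptions, not the paper's.

import numpy as np

def fisher_information(p, dx):
    """Fisher information under a translation parameter,
    J = integral of (p'(x))^2 / p(x) dx, via central differences."""
    dp = np.gradient(p, dx)
    mask = p > 1e-12          # avoid dividing by (numerically) zero density
    return np.sum(dp[mask] ** 2 / p[mask]) * dx

# Grid wide enough that both densities decay to ~0 at the edges.
dx = 0.01
x = np.arange(-40, 40, dx)

# Two independent Laplace densities (closed form: J = 1/b^2 for scale b).
b1, b2 = 1.0, 2.0
p1 = np.exp(-np.abs(x) / b1) / (2 * b1)
p2 = np.exp(-np.abs(x) / b2) / (2 * b2)

# The density of the sum X+Y is the convolution of the two densities.
p_sum = np.convolve(p1, p2, mode="same") * dx

J1, J2, Jsum = (fisher_information(p, dx) for p in (p1, p2, p_sum))
print(f"J(X)^-1 + J(Y)^-1 = {1/J1 + 1/J2:.4f}")  # analytically b1^2 + b2^2 = 5
print(f"J(X+Y)^-1         = {1/Jsum:.4f}")       # FII: should be >= 5
```

For Gaussian X and Y the inequality holds with equality, since J(X) = 1/σ² and the variances add under convolution; replacing the Laplace densities above with normal ones reproduces that case.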
Keywords :
Gaussian processes; convolution; entropy; parameter estimation; random processes; Cramér-Rao lower bound; de Bruijn identity; Fisher information inequality; Gaussian perturbation; convolution inequality; data-processing inequality; differential entropy variation; entropy-power inequality; information theory; random variable; translation parameter; covariance matrix; data processing; density measurement; linear matrix inequalities; mutual information
Journal_Title :
IEEE Transactions on Information Theory