Title of article
Measuring information content from observations for data assimilation: relative entropy versus Shannon entropy difference
Author/Authors
Qin Xu
Issue Information
Journal issue for serial year 2007
Pages
12
From page
198
To page
209
Abstract
The relative entropy is compared with the previously used Shannon entropy difference as a measure of the amount of information extracted from observations by an optimal analysis, in terms of the changes in the probability density function (pdf) produced by the analysis with respect to the background pdf. It is shown that the relative entropy measures both the signal and dispersion parts of the information content from observations, while the Shannon entropy difference measures only the dispersion part. When the pdfs are Gaussian or transformed to Gaussian, the signal part of the information content is given by a weighted inner product of the analysis increment vector, and the dispersion part is given by a non-negative definite function of the analysis and background covariance matrices. When the observation space is transformed based on the singular value decomposition of the scaled observation operator, the information content becomes separable between components associated with different singular values. Densely distributed observations can then be compressed with minimum information loss by truncating the components associated with the smallest singular values. The differences between the relative entropy and the Shannon entropy difference in measuring information content and information loss are analysed in detail and illustrated by examples.
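For Gaussian pdfs the quantities named in the abstract have closed forms: the signal part is (1/2)(xa - xb)^T B^{-1} (xa - xb), the dispersion part is (1/2)[ln(det B / det A) + tr(B^{-1} A) - n], and the Shannon entropy difference is (1/2) ln(det B / det A). The sketch below is an assumed illustration, not the paper's own code: it builds a small synthetic optimal analysis (state dimension n = 3, p = 5 observations, with H, R, B, xb and y all invented for the example) and evaluates the three measures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy optimal-analysis setup (all sizes and fields invented for illustration).
n, p = 3, 5
H = rng.standard_normal((p, n))               # observation operator
R = np.diag(rng.uniform(0.5, 1.5, p))         # observation error covariance
L = rng.standard_normal((n, n))
B = L @ L.T + n * np.eye(n)                   # background error covariance (SPD)
xb = rng.standard_normal(n)                   # background state
y = H @ xb + rng.standard_normal(p)           # synthetic observations

# Optimal analysis: A = (B^-1 + H^T R^-1 H)^-1, xa = xb + A H^T R^-1 (y - H xb).
Binv = np.linalg.inv(B)
Rinv = np.linalg.inv(R)
A = np.linalg.inv(Binv + H.T @ Rinv @ H)      # analysis error covariance
xa = xb + A @ H.T @ Rinv @ (y - H @ xb)       # analysis state

# Signal part: weighted inner product of the analysis increment.
dx = xa - xb
signal = 0.5 * dx @ Binv @ dx

# Shannon entropy difference and dispersion part from the two covariances.
_, logdet_B = np.linalg.slogdet(B)
_, logdet_A = np.linalg.slogdet(A)
entropy_diff = 0.5 * (logdet_B - logdet_A)
dispersion = entropy_diff + 0.5 * (np.trace(Binv @ A) - n)

rel_entropy = signal + dispersion             # relative entropy of analysis vs background
print(signal, dispersion, rel_entropy, entropy_diff)
```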
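The compression argument can be illustrated the same way. Forming the scaled observation operator R^{-1/2} H B^{1/2} and taking its SVD makes all three measures sums over singular-value components, so truncating the component with the smallest singular value loses both signal and dispersion under the relative entropy but only dispersion under the Shannon entropy difference. The per-component formulas below follow from the standard Kalman identities for this setup and are meant only as a consistency sketch, continuing the variables above.

```python
# Scaled observation operator and its SVD (singular values s in descending order).
w, Q = np.linalg.eigh(B)
B_half = Q @ np.diag(np.sqrt(w)) @ Q.T        # symmetric square root of B
Rinv_half = np.diag(1.0 / np.sqrt(np.diag(R)))
U, s, _ = np.linalg.svd(Rinv_half @ H @ B_half, full_matrices=False)

dp = U.T @ (Rinv_half @ (y - H @ xb))         # innovation in the transformed space

# Separable per-component contributions for each singular value s[i].
sig_i = 0.5 * s**2 / (1.0 + s**2)**2 * dp**2       # signal
ent_i = 0.5 * np.log1p(s**2)                       # Shannon entropy difference
disp_i = ent_i + 0.5 * (1.0 / (1.0 + s**2) - 1.0)  # dispersion (non-negative)

# Truncating the smallest-singular-value component: the relative entropy
# counts both its signal and dispersion, the entropy difference only its
# dispersion.
print("relative-entropy loss:", sig_i[-1] + disp_i[-1])
print("entropy-difference loss:", ent_i[-1])
```

Summing sig_i, disp_i and ent_i over all components should recover the totals from the first sketch, which gives a quick numerical check of the separability claim.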
Journal title
Tellus. Series A
Serial Year
2007
Record number
436632