DocumentCode :
1147624
Title :
Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
Author :
Wang, Qing ; Kulkarni, Sanjeev R. ; Verdú, Sergio
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., NJ, USA
Volume :
51
Issue :
9
fYear :
2005
Firstpage :
3064
Lastpage :
3074
Abstract :
We present a universal estimator of the divergence D(P‖Q) for two arbitrary continuous distributions P and Q satisfying certain regularity conditions. This algorithm, which observes independent and identically distributed (i.i.d.) samples from both P and Q, is based on the estimation of the Radon–Nikodym derivative dP/dQ via a data-dependent partition of the observation space. Strong convergence of this estimator is proved with an empirically equivalent segmentation of the space. This basic estimator is further improved by adaptive partitioning schemes and by bias correction. The application of the algorithms to data with memory is also investigated. In the simulations, we compare our estimators with the direct plug-in estimator and estimators based on other partitioning approaches. Experimental results show that our methods achieve the best convergence performance in most of the tested cases.
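Illustrative Sketch :
The following is a minimal Python sketch of the partition-based idea summarized in the abstract; it is not the authors' exact algorithm and omits the adaptive-partitioning and bias-correction refinements described in the paper. The function name estimate_divergence, the quantile-based placement of cell boundaries, and the sqrt(n) default for num_bins are illustrative assumptions only.

import numpy as np

def estimate_divergence(p_samples, q_samples, num_bins=None):
    # Sketch of a partition-based estimate of D(P||Q) from i.i.d. samples:
    # segment the real line into cells holding roughly equal numbers of
    # Q-samples, then compute the discrete divergence between the empirical
    # cell frequencies of the P- and Q-samples.
    p_samples = np.asarray(p_samples, dtype=float)
    q_samples = np.asarray(q_samples, dtype=float)
    if num_bins is None:
        # Heuristic cell count; the paper analyzes how the partition must
        # grow with the sample size, which this sketch does not reproduce.
        num_bins = max(2, int(np.sqrt(len(q_samples))))

    # Interior cell boundaries at empirical quantiles of Q, so each cell
    # contains about the same number of Q-samples (an "empirically
    # equivalent" segmentation of the observation space).
    interior_edges = np.quantile(q_samples, np.arange(1, num_bins) / num_bins)

    # Assign samples to cells (indices 0 .. num_bins - 1).
    p_idx = np.searchsorted(interior_edges, p_samples)
    q_idx = np.searchsorted(interior_edges, q_samples)
    p_freq = np.bincount(p_idx, minlength=num_bins) / len(p_samples)
    q_freq = np.bincount(q_idx, minlength=num_bins) / len(q_samples)

    # Plug-in discrete divergence over the data-dependent partition
    # (cells with no P-samples contribute zero), in nats.
    mask = p_freq > 0
    return float(np.sum(p_freq[mask] * np.log(p_freq[mask] / q_freq[mask])))

As a rough usage check, the estimate can be compared with a known closed-form value, e.g. D(N(1,1)‖N(0,1)) = 0.5 nat:

rng = np.random.default_rng(0)
d_hat = estimate_divergence(rng.normal(1.0, 1.0, 10000),
                            rng.normal(0.0, 1.0, 10000))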
Keywords :
information theory; probability; Radon-Nikodym derivative; adaptive partitioning schemes; arbitrary continuous distribution; bias correction; data-dependent partition; direct plug-in estimator; information measures; universal divergence estimator; Convergence; Density measurement; Entropy; Extraterrestrial measurements; Information theory; Mutual information; Partitioning algorithms; Pattern recognition; Random variables; Testing; Bias correction; Radon–Nikodym derivative; data-dependent partition; divergence; stationary and ergodic data; universal estimation of information measures;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2005.853314
Filename :
1499042