DocumentCode :
2518474
Title :
Kullback-Leibler divergence estimation of continuous distributions
Author :
Perez-Cruz, Fernando
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ
fYear :
2008
fDate :
6-11 July 2008
Firstpage :
1666
Lastpage :
1670
Abstract :
We present a method for estimating the Kullback-Leibler (KL) divergence between continuous densities and prove that it converges almost surely. Divergence estimation is typically solved by estimating the densities first. Our main result shows that this intermediate step is unnecessary and that the divergence can be estimated either from the empirical cdf or from k-nearest-neighbour density estimates, which do not converge to the true measure for finite k. The convergence proof is based on describing the statistics of our estimator with waiting-time distributions, such as the exponential or Erlang. We illustrate the proposed estimators, show how they compare to existing methods based on density estimation, and outline how our divergence estimators can be used to solve the two-sample problem.
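Estimator_Sketch :
For illustration only: a minimal Python sketch of the k-nearest-neighbour form of a direct KL divergence estimator of the kind the abstract describes (the paper's exact construction, constants, and proof are in the text and may differ from this commonly cited form). The function name knn_kl_divergence and the choice k=5 in the example are assumptions, not the paper's notation.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Estimate D(P||Q) from samples x ~ P (shape n x d) and y ~ Q
    (shape m x d) using k-nearest-neighbour distances; no explicit
    density estimate is formed as an intermediate step."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its k-th nearest neighbour among the
    # other x's (query k+1 points because the closest point is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_i: distance from x_i to its k-th nearest neighbour among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Standard k-NN divergence formula: (d/n) * sum(log(nu/rho)) + log(m/(n-1)).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Example: D(N(0,1) || N(1,1)) = 0.5 in closed form.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(1.0, 1.0, size=(5000, 1))
print(knn_kl_divergence(x, y, k=5))  # should be close to 0.5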
Keywords :
information theory; Kullback-Leibler divergence estimation; density estimation; k-nearest-neighbour density estimation; waiting-times distributions; Convergence; Density measurement; Entropy; Frequency estimation; H infinity control; Machine learning; Mutual information; Neuroscience; Random variables; Statistical distributions;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2008 IEEE International Symposium on Information Theory (ISIT 2008)
Conference_Location :
Toronto, ON
Print_ISBN :
978-1-4244-2256-2
Electronic_ISBN :
978-1-4244-2257-9
Type :
conf
DOI :
10.1109/ISIT.2008.4595271
Filename :
4595271