Title :
Estimation of Nonlinear Functionals of Densities With Confidence
Author :
Sricharan, Kumar; Raich, Raviv; Hero, Alfred O., III
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Michigan, Ann Arbor, MI, USA
Date :
July 1, 2012
Abstract :
This paper introduces a class of k-nearest neighbor (k-NN) estimators, called bipartite plug-in (BPI) estimators, for estimating integrals of nonlinear functions of a probability density, such as the Shannon and Rényi entropies. The density is assumed to be smooth, to have bounded support, and to be uniformly bounded from below on that support. Unlike previous k-NN estimators of nonlinear density functionals, the proposed estimator uses data splitting and boundary correction to achieve lower mean squared error. Specifically, we assume that T i.i.d. samples X_i ∈ R^d drawn from the density are split into two subsets of cardinality M and N, respectively; the M samples are used to compute a k-NN density estimate, and the remaining N samples are used for empirical estimation of the integral of the density functional. By studying the statistical properties of k-NN balls, explicit rates for the bias and variance of the BPI estimator are derived in terms of the sample size, the dimension of the samples, and the underlying probability distribution. Based on these results, the tuning parameters M/T and k can be chosen to maximize the rate of decrease of the mean squared error. The resulting optimized BPI estimator converges faster and achieves lower mean squared error than previous k-NN entropy estimators. In addition, a central limit theorem is established for the BPI estimator, which allows tight asymptotic confidence intervals to be specified.
Keywords :
entropy; information theory; mean square error methods; nonlinear functions; probability; BPI estimator; Rényi entropy; Shannon entropy; bipartite plug-in estimator; boundary correction; data-splitting; k-NN estimator; k-nearest neighbor estimators; mean square error; nonlinear density functional estimation; probability density; probability distribution; statistical properties; tuning parameters; Approximation methods; Convergence; Entropy; Estimation; Kernel; Tuning; Adaptive estimators; bias and variance tradeoff; bipartite k-nearest neighbor (k-NN) graphs; concentration bounds; convergence rates; data-splitting estimators; entropy estimation;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2012.2195549