  • DocumentCode
    2265621
  • Title
    Universal estimation of divergence for continuous distributions via data-dependent partitions
  • Author
    Wang, Qing ; Kulkarni, Sanjeev R. ; Verdú, Sergio

  • Author_Institution
    Dept. of Electr. Eng., Princeton Univ., NJ
  • fYear
    2005
  • fDate
    4-9 Sept. 2005
  • Firstpage
    152
  • Lastpage
    156
  • Abstract
    We present a universal estimator of the divergence D(P‖Q) for two arbitrary continuous distributions P and Q satisfying certain regularity conditions. This algorithm, which observes i.i.d. samples from both P and Q, is based on the estimation of the Radon-Nikodym derivative dP/dQ via a data-dependent partition of the observation space. Strong convergence of this estimator is proved with an empirically equivalent segmentation of the space. This basic estimator is further improved by adaptive partitioning schemes and by bias correction. In the simulations, we compare our estimators with the plug-in estimator and estimators based on other partitioning approaches. Experimental results show that our methods achieve the best convergence performance in most of the tested cases.
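    The general idea behind partition-based divergence estimation can be sketched as follows: split the observation space into cells determined by the data, estimate the cell probabilities under P and Q empirically, and sum the resulting discrete divergence. The sketch below (a one-dimensional illustration, not the authors' exact scheme) uses empirical quantiles of the Q sample as a simple data-dependent partition; the function name, cell count, and quantile-based segmentation are assumptions for illustration only.

```python
import numpy as np

def divergence_estimate(p_samples, q_samples, num_cells=10):
    """Sketch: partition-based estimate of D(P||Q) in one dimension.

    Cell boundaries are empirical quantiles of the Q sample, i.e. a
    data-dependent partition that is equiprobable under the empirical
    measure of Q. This illustrates the general approach, not the
    specific algorithm of the paper.
    """
    p_samples = np.asarray(p_samples, dtype=float)
    q_samples = np.asarray(q_samples, dtype=float)

    # Cell edges at empirical quantiles of Q; widen the outer edges
    # slightly so every P sample falls inside some cell.
    edges = np.quantile(q_samples, np.linspace(0.0, 1.0, num_cells + 1))
    edges[0] = min(edges[0], p_samples.min()) - 1e-9
    edges[-1] = max(edges[-1], p_samples.max()) + 1e-9

    # Empirical cell probabilities under P and Q.
    p_counts, _ = np.histogram(p_samples, bins=edges)
    q_counts, _ = np.histogram(q_samples, bins=edges)
    p_hat = p_counts / p_counts.sum()
    q_hat = q_counts / q_counts.sum()

    # Discrete divergence over cells with positive P mass; by the
    # quantile construction each cell has positive Q mass.
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask])))
```

    For example, with large samples from N(1, 1) and N(0, 1) the estimate approaches the true value D(P‖Q) = 0.5, while identical distributions give an estimate near zero; the discrete divergence over any finite partition lower-bounds the true divergence, which is one source of the bias the paper's correction targets.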
  • Keywords
    entropy; parameter estimation; adaptive partitioning schemes; continuous distributions; data-dependent partitions; information divergence; universal estimation; Convergence; Entropy; Error probability; Information theory; Mutual information; Optimization methods; Partitioning algorithms; Q measurement; Random variables; Testing;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Information Theory, 2005. ISIT 2005. Proceedings. International Symposium on
  • Conference_Location
    Adelaide, SA
  • Print_ISBN
    0-7803-9151-9
  • Type
    conf
  • DOI
    10.1109/ISIT.2005.1523312
  • Filename
    1523312