Title :
Closed-form information-theoretic divergences for statistical mixtures
Abstract :
Statistical mixtures such as Rayleigh, Wishart, or Gaussian mixture models are commonly used in pattern recognition and signal processing tasks. Since the Kullback-Leibler divergence between any two such mixture models does not admit an analytical expression, the relative entropy can only be approximated numerically using time-consuming Monte Carlo stochastic sampling. This drawback has motivated the quest for alternative information-theoretic divergences, such as the recent Jensen-Rényi, Cauchy-Schwarz, or total square loss divergences, that bypass numerical approximation by providing exact analytic expressions. In this paper, we state sufficient conditions on the mixture distribution family so that these novel non-KL statistical divergences between any two such mixtures can be expressed in generic closed-form formulas.
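To make the contrast described in the abstract concrete, here is a minimal Python sketch (not taken from the paper) comparing a Monte Carlo estimate of the Kullback-Leibler divergence between two univariate Gaussian mixtures with the closed-form Cauchy-Schwarz divergence, which reduces to pairwise Gaussian product integrals. The helper names (gmm_pdf, kl_monte_carlo, cauchy_schwarz) and the example mixtures are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# A mixture is represented as (weights, means, variances), all 1-D arrays.

def gmm_pdf(x, w, mu, var):
    """Evaluate a 1-D Gaussian mixture density at the points x."""
    x = np.asarray(x)[:, None]
    comp = np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return comp @ w

def kl_monte_carlo(p, q, n=100_000, seed=0):
    """Stochastic estimate of KL(p||q): sample from p, average log(p/q)."""
    rng = np.random.default_rng(seed)
    wp, mp, vp = p
    idx = rng.choice(len(wp), size=n, p=wp)          # pick mixture components
    x = rng.normal(mp[idx], np.sqrt(vp[idx]))        # sample from chosen Gaussians
    return np.mean(np.log(gmm_pdf(x, *p)) - np.log(gmm_pdf(x, *q)))

def gauss_product_integral(m1, v1, m2, v2):
    """Closed form of the integral of N(x; m1, v1) * N(x; m2, v2) over x."""
    v = v1 + v2
    return np.exp(-0.5 * (m1 - m2) ** 2 / v) / np.sqrt(2 * np.pi * v)

def cauchy_schwarz(p, q):
    """Closed-form Cauchy-Schwarz divergence between two Gaussian mixtures."""
    (wp, mp, vp), (wq, mq, vq) = p, q
    cross = wp @ gauss_product_integral(mp[:, None], vp[:, None], mq, vq) @ wq
    pp = wp @ gauss_product_integral(mp[:, None], vp[:, None], mp, vp) @ wp
    qq = wq @ gauss_product_integral(mq[:, None], vq[:, None], mq, vq) @ wq
    return -np.log(cross / np.sqrt(pp * qq))

# Hypothetical two-component mixtures used only to exercise the sketch.
p = (np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
q = (np.array([0.7, 0.3]), np.array([0.0, 3.0]), np.array([1.0, 0.8]))
print("Monte Carlo KL(p||q):", kl_monte_carlo(p, q))
print("Closed-form CS(p, q):", cauchy_schwarz(p, q))
```

The KL estimate requires drawing many samples and re-evaluating both mixture densities, whereas the Cauchy-Schwarz divergence only sums over pairs of components, which is the kind of exact analytic expression the paper advocates.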
Keywords :
Monte Carlo methods; approximation theory; entropy; pattern recognition; sampling methods; statistical distributions; stochastic processes; Kullback-Leibler divergence; Monte Carlo stochastic sampling; information-theoretic divergence; non-KL statistical divergence; numerical approximation; relative entropy; signal processing; statistical mixture distribution; sufficient conditions; closed-form solutions; Gaussian mixture model; Laplace equations; Shape
Conference_Titel :
2012 21st International Conference on Pattern Recognition (ICPR)
Conference_Location :
Tsukuba, Japan
Print_ISBN :
978-1-4673-2216-4