DocumentCode :
3499246
Title :
Closed-form Cauchy-Schwarz PDF divergence for mixture of Gaussians
Author :
Kampa, Kittipat ; Hasanbelliu, Erion ; Principe, Jose C.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Florida, Gainesville, FL, USA
fYear :
2011
fDate :
July 31 2011-Aug. 5 2011
Firstpage :
2578
Lastpage :
2585
Abstract :
This paper presents an efficient approach to calculating the difference between two probability density functions (pdfs), each of which is a mixture of Gaussians (MoG). Unlike the Kullback-Leibler divergence (DKL), the Cauchy-Schwarz (CS) pdf divergence measure (DCS) that we propose admits an analytic, closed-form expression for MoG. This property of the DCS makes fast and efficient calculation possible, which is highly desirable in real-world applications where the dimensionality of the data/features is very high. We show that DCS follows trends similar to DKL but can be computed much faster, especially when the dimensionality is high. Moreover, the proposed method significantly outperforms DKL in classifying real-world 2D and 3D objects and in static hand-posture recognition based on distances alone.
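The closed form described in the abstract follows from the Gaussian product identity ∫N(x; m1, S1)N(x; m2, S2)dx = N(m1; m2, S1+S2), which turns every integral in DCS(p,q) = -log(∫pq / sqrt(∫p² ∫q²)) into a finite double sum over mixture components. Below is a minimal sketch of that computation for two GMMs; all function and variable names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal


def gauss_overlap(m1, S1, m2, S2):
    # Gaussian product identity:
    # integral of N(x; m1, S1) * N(x; m2, S2) dx = N(m1; mean=m2, cov=S1 + S2)
    return multivariate_normal.pdf(m1, mean=m2, cov=S1 + S2)


def cs_divergence(w_p, mu_p, cov_p, w_q, mu_q, cov_q):
    """Closed-form Cauchy-Schwarz divergence between two Gaussian mixtures.

    Each mixture is given as (weights, list of mean vectors, list of
    covariance matrices).  D_CS = -log( <p,q> / sqrt(<p,p> <q,q>) ),
    where each inner product is a double sum of Gaussian overlaps.
    """
    def cross(wa, ma, Sa, wb, mb, Sb):
        # <a, b> = sum_i sum_j wa_i * wb_j * integral( N_i * N_j )
        return sum(wa[i] * wb[j] * gauss_overlap(ma[i], Sa[i], mb[j], Sb[j])
                   for i in range(len(wa)) for j in range(len(wb)))

    pq = cross(w_p, mu_p, cov_p, w_q, mu_q, cov_q)
    pp = cross(w_p, mu_p, cov_p, w_p, mu_p, cov_p)
    qq = cross(w_q, mu_q, cov_q, w_q, mu_q, cov_q)
    return -np.log(pq / np.sqrt(pp * qq))
```

The cost is O(K_p·K_q) Gaussian evaluations regardless of data dimension, which is the source of the speed advantage over DKL (which for MoG has no closed form and must be approximated, e.g. by Monte Carlo sampling). DCS is zero when the two mixtures coincide and positive otherwise.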
Keywords :
Gaussian processes; image classification; object recognition; probability; Gaussian mixture model; Kullback-Leibler divergence; MoG; PDF divergence; closed-form Cauchy-Schwarz; closed-form expression; mixture of Gaussians; object classification; posture recognition; probability density functions; Accuracy; Closed-form solutions; Feature extraction; Probability density function; Three dimensional displays;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
ISSN :
2161-4393
Print_ISBN :
978-1-4244-9635-8
Type :
conf
DOI :
10.1109/IJCNN.2011.6033555
Filename :
6033555