DocumentCode
1423217
Title
Simplifying Mixture Models Through Function Approximation
Author
Zhang, Kai; Kwok, James T.
Author_Institution
Life Sci. Div., Lawrence Berkeley Nat. Lab., Berkeley, CA, USA
Volume
21
Issue
4
fYear
2010
fDate
4/1/2010
Firstpage
644
Lastpage
658
Abstract
The finite mixture model is widely used in various statistical learning problems. However, the model obtained may contain a large number of components, making it inefficient in practical applications. In this paper, we propose to simplify the mixture model by minimizing an upper bound of the approximation error between the original and the simplified model under the L2 distance measure. This is achieved by first grouping similar components together and then performing local fitting through function approximation. The simplified model can then be used as a replacement for the original model to speed up various algorithms involving mixture models during training (e.g., Bayesian filtering, belief propagation) and testing [e.g., kernel density estimation, support vector machine (SVM) testing]. Encouraging results are observed in experiments on density estimation, clustering-based image segmentation, and simplification of SVM decision functions.
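For intuition only, the following Python sketch illustrates the general two-step idea sketched in the abstract (group similar components, then fit locally); it is a hypothetical simplification, not the authors' algorithm: it assumes 1-D Gaussian components, groups them by sorting their means, and replaces each group with a single moment-matched Gaussian rather than minimizing the paper's L2 error bound.

# Minimal illustrative sketch (hypothetical; not the paper's exact procedure):
# simplify a 1-D Gaussian mixture by (i) grouping components with nearby means
# and (ii) replacing each group with a single moment-matched Gaussian.
import numpy as np

def simplify_gaussian_mixture(weights, means, variances, n_groups):
    # Group similar components by sorting the means and splitting the sorted
    # order into n_groups contiguous chunks (a stand-in for the paper's
    # bound-driven grouping step).
    order = np.argsort(means)
    groups = np.array_split(order, n_groups)
    new_w, new_mu, new_var = [], [], []
    for g in groups:
        w, mu, var = weights[g], means[g], variances[g]
        w_sum = w.sum()
        m = np.dot(w, mu) / w_sum                  # matched group mean
        v = np.dot(w, var + mu**2) / w_sum - m**2  # matched group variance
        new_w.append(w_sum)
        new_mu.append(m)
        new_var.append(v)
    return np.array(new_w), np.array(new_mu), np.array(new_var)

# Example: collapse a 100-component mixture to 5 components.
rng = np.random.default_rng(0)
w = rng.dirichlet(np.ones(100))
mu = rng.normal(0.0, 5.0, size=100)
var = rng.uniform(0.5, 2.0, size=100)
print(simplify_gaussian_mixture(w, mu, var, n_groups=5))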
Keywords
data analysis; function approximation; pattern clustering; support vector machines; SVM decision function simplification; clustering-based image segmentation; density estimation; finite mixture model; mixture model simplification; statistical learning problems; Clustering; mixture models; support vector machine (SVM) testing; Algorithms; Artificial Intelligence; Cluster Analysis; Computer Simulation; Humans; Image Processing, Computer-Assisted; Learning; Models, Statistical; Pattern Recognition, Automated; Probability; Time Factors
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/TNN.2010.2040835
Filename
5418870