DocumentCode :
457103
Title :
Continuous Optimization Based on Boosting Gaussian Mixture Model
Author :
Li, Bin ; Zhong, Run-tian ; Wang, Xian-ji ; Zhuang, Zhen-quan
Author_Institution :
Nature Inspired Comput. & Applications Lab., Univ. of Sci. & Technol. of China, Hefei
Volume :
1
fYear :
2006
fDate :
0-0 0
Firstpage :
1192
Lastpage :
1195
Abstract :
A new estimation of distribution algorithm (EDA) based on the Gaussian mixture model (GMM) is proposed, in which boosting, an efficient ensemble learning method, is adopted to estimate the GMM. By boosting simple two-component GMMs, the algorithm learns the model structure and parameters automatically, without requiring any prior knowledge. Moreover, since boosting can be viewed as a gradient search for a good fit of some objective in function space, the new EDA is time efficient. A set of experiments is carried out to evaluate the efficiency and performance of the new algorithm. The results show that, with a relatively smaller population and fewer generations, the new algorithm performs as well as the compared EDAs in optimizing multimodal functions.
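A minimal sketch of how such a GMM-based EDA could be organized is given below. It is illustrative only: the pooling of bootstrap-fitted two-component GMMs stands in for the paper's actual boosting procedure, and all function names, parameter values, and the test function are assumptions, not the authors' implementation.

import numpy as np
from sklearn.mixture import GaussianMixture

def sphere(x):
    # Simple multimodal-free test objective (illustrative placeholder).
    return np.sum(x ** 2, axis=1)

def gmm_eda(objective, dim=5, pop_size=100, n_select=30,
            n_generations=50, n_boost_rounds=3, seed=0):
    rng = np.random.default_rng(seed)
    population = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    for _ in range(n_generations):
        fitness = objective(population)
        # Truncation selection: keep the best individuals.
        selected = population[np.argsort(fitness)[:n_select]]
        # Approximate "boosting of simple two-component GMMs": fit several
        # weak 2-component models on bootstrap resamples and pool them.
        models = []
        for _ in range(n_boost_rounds):
            sample = selected[rng.integers(0, n_select, size=n_select)]
            gmm = GaussianMixture(n_components=2, covariance_type="full",
                                  reg_covar=1e-6,
                                  random_state=int(rng.integers(1 << 30)))
            gmm.fit(sample)
            models.append(gmm)
        # Sample the next population from the pooled mixture, keeping the
        # current best individual (elitism).
        per_model = pop_size // n_boost_rounds
        new_pop = np.vstack([m.sample(per_model)[0] for m in models])
        best = selected[0:1]
        population = np.vstack([best, new_pop])[:pop_size]
    fitness = objective(population)
    return population[np.argmin(fitness)], fitness.min()

if __name__ == "__main__":
    x_best, f_best = gmm_eda(sphere)
    print("best objective value:", f_best)

In this sketch the number of mixture components per weak model is fixed at two, as in the abstract; the effective model complexity grows only through the number of pooled weak models, which is the intuition behind learning the structure without prior knowledge.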
Keywords :
Gaussian distribution; Gaussian processes; gradient methods; learning (artificial intelligence); optimisation; search problems; GMM boosting; Gaussian mixture model; automatic parameter learning; continuous optimization; distribution algorithm estimation; ensemble learning; function space; gradient search; model structure learning; multimodal function optimization; Boosting; Clustering algorithms; Clustering methods; Computer applications; Distributed computing; Electronic design automation and methodology; Iterative algorithms; Laboratories; Learning systems; Probability density function;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Pattern Recognition, 2006. ICPR 2006. 18th International Conference on
Conference_Location :
Hong Kong
ISSN :
1051-4651
Print_ISBN :
0-7695-2521-0
Type :
conf
DOI :
10.1109/ICPR.2006.412
Filename :
1699103