DocumentCode :
1539475
Title :
Bayesian self-organising map for Gaussian mixtures
Author :
Yin, H. ; Allinson, N.M.
Author_Institution :
Dept. of Electr. Eng. & Electron., Univ. of Manchester Inst. of Sci. & Technol., UK
Volume :
148
Issue :
4
fYear :
2001
fDate :
8/1/2001
Firstpage :
234
Lastpage :
240
Abstract :
A Bayesian self-organising map (BSOM) is proposed for learning mixtures of Gaussian distributions. It is derived naturally from minimising the Kullback-Leibler (1951) divergence between the data density and the neural model. The inferred posterior probabilities of the neurons replace the common Euclidean-distance winning rule and explicitly define the neighbourhood function. Learning can be confined to a small but fixed neighbourhood of the winner. The BSOM in turn provides insight into the role of neighbourhood functions in the common SOM. A formal comparison between the BSOM and the expectation-maximisation (EM) algorithm is also presented, together with experimental results.
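The abstract's core idea, that posterior probabilities of the neurons replace the Euclidean-distance winning rule and that adaptation is confined to a small fixed neighbourhood of the winner, can be illustrated with a minimal sketch. This is an assumed toy implementation for a 1-D map of isotropic Gaussian components (the function names, learning rate and neighbourhood width are illustrative, not from the paper):

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    # Isotropic d-dimensional Gaussian density.
    d = x.shape[-1]
    diff = x - mean
    return np.exp(-0.5 * np.dot(diff, diff) / var) / ((2.0 * np.pi * var) ** (d / 2))

def bsom_step(x, means, variances, priors, lr=0.05):
    """One hypothetical BSOM-style update.

    The posterior p(k | x) replaces the Euclidean winning rule, and
    only the winner plus its two map neighbours are adapted
    (a small, fixed 1-D neighbourhood).
    """
    K = len(means)
    likes = np.array([priors[k] * gaussian_pdf(x, means[k], variances[k])
                      for k in range(K)])
    post = likes / likes.sum()            # posterior responsibilities p(k | x)
    winner = int(np.argmax(post))         # Bayesian winner, not the nearest mean
    for k in (winner - 1, winner, winner + 1):
        if 0 <= k < K:
            # Posterior-weighted move of the mean towards the sample.
            means[k] = means[k] + lr * post[k] * (x - means[k])
    return winner, post
```

Repeating `bsom_step` over a data stream drifts the component means towards the data density, which is the sense in which the BSOM resembles an online E-step/M-step of EM, one of the comparisons the paper makes formally.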
Keywords :
Gaussian distribution; belief networks; image classification; maximum likelihood estimation; optimisation; pattern recognition; self-organising feature maps; unsupervised learning; Bayesian self-organising map; Gaussian mixtures; Kohonen self-organising map; Kullback-Leibler divergence minimisation; MLE; X-ray diffraction image; clustering; data density; expectation-maximisation algorithm; neighbourhood function; neural model; posterior probabilities; unsupervised classification;
fLanguage :
English
Journal_Title :
IEE Proceedings - Vision, Image and Signal Processing
Publisher :
IET
ISSN :
1350-245X
Type :
jour
DOI :
10.1049/ip-vis:20010378
Filename :
955429