DocumentCode
395536
Title
Resolution of singularities in mixture models and its stochastic complexity
Author
Yamazaki, Keisuke; Watanabe, Sumio
Author_Institution
Dept. of Adv. Appl. Electron., Tokyo Inst. of Technol., Yokohama, Japan
Volume
3
fYear
2002
fDate
18-22 Nov. 2002
Firstpage
1355
Abstract
A learning machine that is a mixture of several distributions, for example, a Gaussian mixture or a mixture of experts, has a wide range of applications. However, such a machine is a non-identifiable statistical model with many singularities in its parameter space, so its generalization properties remain unknown. Recently, an algebraic geometrical method has been developed that makes it possible to treat such learning machines mathematically. Based on this method, this paper rigorously proves that a mixture learning machine has a smaller Bayesian stochastic complexity than regular statistical models. Since the generalization error of a learning machine equals the increase of the stochastic complexity, this result shows that a mixture model attains more precise prediction than regular statistical models when Bayesian estimation is applied in statistical inference.
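In the standard notation of singular learning theory (a sketch under the usual assumptions; the symbols below are not quoted from the paper itself), the quantities the abstract refers to can be written as follows. For samples X_1, \dots, X_n drawn from a true distribution q(x), a model p(x \mid w), and a prior \varphi(w), the normalized stochastic complexity and the average generalization error are related by

F(n) = -\log \int \prod_{i=1}^{n} \frac{p(X_i \mid w)}{q(X_i)}\, \varphi(w)\, dw,
\qquad
\mathbb{E}[G(n)] = \mathbb{E}[F(n+1)] - \mathbb{E}[F(n)].

Asymptotically \mathbb{E}[F(n)] = \lambda \log n + o(\log n); a regular model with d parameters has \lambda = d/2, while the paper's result is that a mixture model satisfies \lambda < d/2, so its generalization error, approximately \lambda/n, falls below the regular-model rate d/(2n).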
Keywords
Bayes methods; Gaussian distribution; computational complexity; learning (artificial intelligence); probability; stochastic processes; Bayesian estimation; Bayesian stochastic complexity; Gaussian mixture; learning machine; mixture model; probability distributions; statistical inference; statistical model; Information processing; Laboratories; Learning systems; Machine learning; Mathematical model; Predictive models
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
Print_ISBN
981-04-7524-1
Type
conf
DOI
10.1109/ICONIP.2002.1202842
Filename
1202842