Title :
Resolution of singularities in mixture models and its stochastic complexity
Author :
Yamazaki, Keisuke; Watanabe, Sumio
Author_Institution :
Dept. of Adv. Appl. Electron., Tokyo Inst. of Technol., Yokohama, Japan
Abstract :
A learning machine that is a mixture of several distributions, such as a Gaussian mixture or a mixture of experts, has a wide range of applications. However, such a machine is a non-identifiable statistical model with many singularities in its parameter space, so its generalization performance has remained unknown. Recently, an algebraic geometrical method was developed that enables such learning machines to be treated mathematically. Based on this method, this paper rigorously proves that a mixture learning machine has smaller Bayesian stochastic complexity than a regular statistical model. Since the generalization error of a learning machine equals the increase of the stochastic complexity, this result shows that the mixture model attains more precise prediction than regular statistical models when Bayesian estimation is applied in statistical inference.
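As a concrete reading of the claim, the following is a minimal LaTeX sketch of the asymptotics from Watanabe's singular learning theory, on which the paper builds; the symbols \lambda (the real log canonical threshold) and m (its multiplicity) are standard notation in that theory rather than values quoted from this record:

  % Stochastic complexity of a singular model after n training samples
  F(n) = \lambda \log n - (m - 1) \log\log n + O(1)
  % A regular statistical model with d parameters has \lambda = d/2 and m = 1,
  % so F_{\mathrm{reg}}(n) = \frac{d}{2} \log n + O(1); singular mixtures satisfy \lambda \le \frac{d}{2}
  % Generalization error as the increase of the stochastic complexity
  G(n) = \mathbb{E}\left[ F(n+1) - F(n) \right] \approx \frac{\lambda}{n}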
Keywords :
Bayes methods; Gaussian distribution; computational complexity; learning (artificial intelligence); probability; stochastic processes; Bayesian estimation; Bayesian stochastic complexity; Gaussian mixture; learning machine; mixture model; probability distributions; statistical inference; statistical model; Information processing; Learning systems; Machine learning; Mathematical model; Predictive models
Conference_Title :
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
Print_ISBN :
981-04-7524-1
DOI :
10.1109/ICONIP.2002.1202842