DocumentCode :
1818461
Title :
Bayesian ying-yang supervised learning, modular models, and three layer nets
Author :
Xu, Lei
Author_Institution :
Dept. of Comput. Sci. & Eng., Chinese Univ. of Hong Kong, Shatin, Hong Kong
Volume :
1
fYear :
1999
fDate :
1999
Firstpage :
540
Abstract :
The Bayesian ying-yang (BYY) supervised learning system and theory are further elaborated, and previous results on their use for mixture-of-experts models, radial basis functions, and three-layer nets are systematically summarized. Moreover, new results on three-layer nets are presented. Using a Taylor expansion of the distribution of the output layer, we find that maximum likelihood (ML) learning in a net with a probabilistic hidden layer is equivalent to adding a regularization term to its counterpart with a deterministic hidden layer, which leads not only to an adaptive EM-like algorithm for ML learning in three-layer nets, but also to a new type of regularization technique. Furthermore, an improved BYY criterion is obtained for selecting the number of hidden units.
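A minimal numerical sketch of the flavour of the stated equivalence, not the paper's BYY derivation: for a three-layer net with a linear output layer and squared error, the expected loss under zero-mean Gaussian noise injected into the hidden activations equals the deterministic loss plus a penalty on the output-layer weights, i.e. a regularization term. The network sizes, the noise level sigma, and all variable names below are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy three-layer net: input -> sigmoid hidden layer -> linear output layer.
n_in, n_hid, n_out, n_samples = 5, 8, 3, 200
W1 = rng.normal(size=(n_hid, n_in))
b1 = rng.normal(size=n_hid)
W2 = rng.normal(size=(n_out, n_hid))
b2 = rng.normal(size=n_out)

X = rng.normal(size=(n_samples, n_in))
T = rng.normal(size=(n_samples, n_out))        # arbitrary targets for the demo
H = 1.0 / (1.0 + np.exp(-(X @ W1.T + b1)))     # deterministic hidden activations
sigma = 0.1                                    # assumed hidden-unit noise std

def mse(H_batch):
    # Squared-error loss of the linear output layer, averaged over samples.
    Y = H_batch @ W2.T + b2
    return np.mean(np.sum((T - Y) ** 2, axis=1))

det_loss = mse(H)

# Monte Carlo estimate of the expected loss with a "probabilistic" hidden layer
# (hidden activations perturbed by zero-mean Gaussian noise).
mc_loss = np.mean([mse(H + sigma * rng.normal(size=H.shape))
                   for _ in range(2000)])

# Closed form: deterministic loss + sigma^2 * ||W2||_F^2, a regularization term
# on the output-layer weights (cross terms vanish because the noise has zero mean).
reg_loss = det_loss + sigma ** 2 * np.sum(W2 ** 2)

print(f"deterministic loss      : {det_loss:.4f}")
print(f"stochastic hidden (MC)  : {mc_loss:.4f}")
print(f"deterministic + penalty : {reg_loss:.4f}")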
Keywords :
Bayes methods; feedforward neural nets; learning (artificial intelligence); maximum likelihood estimation; probability; radial basis function networks; Bayesian ying-yang; Taylor expansion; maximum likelihood learning; mixture-of-expert models; multilayer neural nets; probabilistic hidden layer; supervised learning; Bayesian methods; Computer science; Inverse problems; Learning systems; Maximum likelihood estimation; Smoothing methods; Statistical learning; Supervised learning; Taylor series; Unsupervised learning;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.831555
Filename :
831555