DocumentCode :
1551429
Title :
Global Boltzmann perceptron network for online learning of conditional distributions
Author :
Thathachar, M.A.L.; Arvind, M.T.
Author_Institution :
Dept. of Electr. Eng., Indian Inst. of Sci., Bangalore, India
Volume :
10
Issue :
5
fYear :
1999
fDate :
9/1/1999 12:00:00 AM
Firstpage :
1090
Lastpage :
1098
Abstract :
This paper proposes a backpropagation-based feedforward neural network for learning probability distributions of outputs conditioned on inputs, using only incoming input-output samples. The backpropagation procedure is shown to locally minimize the Kullback-Leibler measure in an expected sense. The procedure is enhanced to ensure boundedness of the weights and to promote exploration of the search space toward a global minimum. Weak convergence theory is employed to show that the long-term behavior of the resulting algorithm can be approximated by that of a stochastic differential equation whose invariant distributions are concentrated around the global minima of the Kullback-Leibler measure within a region of interest. Simulation studies on problems involving samples arriving from a mixture of labeled densities, as well as the well-known Iris data problem, demonstrate the speed and accuracy of the proposed procedure.
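The core idea in the abstract can be illustrated with a much simpler stand-in: online stochastic gradient descent on the negative log-likelihood of a softmax model, which minimizes the Kullback-Leibler divergence to the true conditional distribution in an expected sense. This is a minimal sketch under that assumption, not the paper's actual Boltzmann perceptron network or its global-search enhancement; the toy problem (a binary input with known conditionals) and all parameter names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the logit vector z.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def train_online(n_steps=50000, lr=0.05):
    # Toy conditional distribution: x in {0, 1}, with
    # P(y=1 | x=0) = 0.2 and P(y=1 | x=1) = 0.8.
    # Model: P(y | x) = softmax(W x + b), updated one sample at a time.
    W = np.zeros((2, 1))  # 2 output classes, 1 input feature
    b = np.zeros(2)
    for _ in range(n_steps):
        x = int(rng.integers(0, 2))
        p1 = 0.2 if x == 0 else 0.8
        y = int(rng.random() < p1)          # sample y from the true conditional
        xv = np.array([float(x)])
        p = softmax(W @ xv + b)
        grad = p.copy()
        grad[y] -= 1.0                      # gradient of -log P(y|x) w.r.t. logits
        W -= lr * np.outer(grad, xv)        # online SGD step
        b -= lr * grad
    return W, b

W, b = train_online()
p_x0 = softmax(b)                           # learned P(y | x=0)
p_x1 = softmax(W @ np.array([1.0]) + b)     # learned P(y | x=1)
```

After training, `p_x0[1]` and `p_x1[1]` settle near the true conditionals 0.2 and 0.8, up to noise from the constant step size; the paper's contribution lies in the convergence analysis and the global-minimization enhancement, which this sketch does not attempt.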
Keywords :
Boltzmann machines; backpropagation; convergence of numerical methods; feedforward neural nets; probability; real-time systems; Boltzmann perceptron network; Kullback-Leibler measure; backpropagation; conditional distributions; feedforward neural network; mixture density; online learning; probability distributions; search space; stochastic differential equation; weak convergence; Backpropagation algorithms; Convergence; Differential equations; Diseases; Entropy; Feedforward neural networks; Iris; Neural networks; Probability distribution; Stochastic processes;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.788649
Filename :
788649