DocumentCode :
1749085
Title :
Improving the performance of symmetric diffusion networks via biologically inspired constraints
Author :
Medler, David A. ; McClelland, James L.
Author_Institution :
Center for the Neural Basis of Cognition, Carnegie Mellon Univ., Pittsburgh, PA, USA
Volume :
1
fYear :
2001
fDate :
2001
Firstpage :
400
Abstract :
Symmetric diffusion networks (SDNs) are a class of networks based upon the principles of continuous, stochastic, adaptive, and interactive processing. SDNs are essentially a continuous form of the Boltzmann machine trained with the contrastive Hebbian learning algorithm. Thus, one advantage SDNs have over standard backpropagation networks is that they can learn continuous probability distributions; that is, they can learn multiple distinct outputs for a single input. However, SDNs are difficult to train, especially on large training sets. To improve network learning performance, neurophysiologically inspired constraints were systematically imposed upon the networks. Results indicate that the application of such constraints dramatically improves the performance of SDNs, both in their rate of learning and in their ability to learn appropriate internal representations.
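For readers unfamiliar with contrastive Hebbian learning, the sketch below illustrates the generic two-phase (free vs. clamped) weight update on a small symmetric network. This is only a minimal illustration under stated assumptions, not the authors' SDN formulation: the deterministic mean-field settling dynamics, network size, learning rate, and the helper names `settle` and `chl_update` are all introduced here for clarity.

```python
import numpy as np

# Minimal sketch of contrastive Hebbian learning (CHL) on a small
# symmetric network. A generic illustration only; the settling schedule,
# network size, and learning rate are arbitrary assumptions, and the
# paper's SDNs additionally use continuous stochastic dynamics.

rng = np.random.default_rng(0)
n = 8                       # total units (inputs + hidden + outputs)
W = 0.1 * rng.standard_normal((n, n))
W = (W + W.T) / 2           # symmetric weights, as in Boltzmann machines
np.fill_diagonal(W, 0.0)    # no self-connections
lr = 0.05                   # learning rate

def settle(s, clamped, steps=50):
    """Relax the unclamped units toward a fixed point of
    deterministic mean-field dynamics."""
    s = s.copy()
    free = ~clamped
    for _ in range(steps):
        s[free] = np.tanh(W @ s)[free]
    return s

def chl_update(x_idx, x_val, y_idx, y_val):
    """One CHL step: the minus phase clamps only the inputs,
    the plus phase clamps both inputs and target outputs."""
    global W
    s = np.zeros(n)
    clamp_in = np.zeros(n, dtype=bool)
    clamp_in[x_idx] = True
    s[x_idx] = x_val

    # Minus (free) phase: inputs clamped, the rest settles.
    s_minus = settle(s, clamp_in)

    # Plus (clamped) phase: inputs and targets clamped.
    clamp_both = clamp_in.copy()
    clamp_both[y_idx] = True
    s[y_idx] = y_val
    s_plus = settle(s, clamp_both)

    # Hebbian difference of co-activity between the two phases.
    W += lr * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
    W = (W + W.T) / 2
    np.fill_diagonal(W, 0.0)
```

The key design point is that the same local Hebbian statistic is collected in both phases; learning is driven purely by the difference between clamped and free co-activity, which is what lets such networks capture multiple distinct outputs for one input.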
Keywords :
Bayes methods; Boltzmann machines; Hebbian learning; probability; constraints; probability distribution; symmetric diffusion networks; Algorithm design and analysis; Backpropagation algorithms; Bayesian methods; Cerebral cortex; Cognition; Hebbian theory; Neural networks; Sampling methods; Stochastic processes
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939053
Filename :
939053