Title :
Improving the performance of symmetric diffusion networks via biologically inspired constraints
Author :
Medler, David A. ; McClelland, James L.
Author_Institution :
Center for the Neural Basis of Cognition, Carnegie Mellon Univ., Pittsburgh, PA, USA
Abstract :
Symmetric diffusion networks (SDNs) are a class of networks based on the principles of continuous, stochastic, adaptive, and interactive processing. An SDN is essentially a continuous form of the Boltzmann machine trained with the contrastive Hebbian learning algorithm. One advantage SDNs therefore have over standard backpropagation networks is that they can learn continuous probability distributions; that is, they can learn multiple distinct outputs for a single input. However, SDNs are difficult to train, especially on large training sets. To improve learning performance, neurophysiologically inspired constraints were systematically imposed on the networks. Results indicate that applying such constraints dramatically improves the performance of SDNs, both in their rate of learning and in their acquisition of appropriate internal representations.
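As a rough illustration of the contrastive Hebbian learning the abstract describes, the following sketch implements the discrete two-phase (clamped vs. free) update on a small symmetric stochastic network in the Boltzmann-machine style. It is an assumption-laden approximation, not the paper's continuous diffusion dynamics: the function names (`settle`, `chl_step`), the Gibbs-sampling settling procedure, and the parameters (`eta`, `n_steps`, `T`) are all illustrative choices, not taken from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def settle(W, s, clamped, n_steps=50, T=1.0):
    """Gibbs-sample the unclamped units of a symmetric network toward equilibrium.

    Illustrative stand-in for the continuous stochastic settling of an SDN.
    """
    n = len(s)
    for _ in range(n_steps):
        for i in range(n):
            if clamped[i]:
                continue
            net = W[i] @ s  # symmetric weights: W[i, j] == W[j, i]
            p = 1.0 / (1.0 + np.exp(-net / T))
            s[i] = 1.0 if rng.random() < p else 0.0
    return s

def chl_step(W, x, y, vis_in, vis_out, eta=0.05):
    """One contrastive Hebbian update: clamped (plus) phase minus free (minus) phase."""
    n = W.shape[0]
    # Plus phase: clamp both input and output units, then settle.
    s = rng.integers(0, 2, n).astype(float)
    s[vis_in], s[vis_out] = x, y
    clamped = np.zeros(n, bool)
    clamped[vis_in] = clamped[vis_out] = True
    s_plus = settle(W, s, clamped)
    # Minus phase: clamp only the inputs; the outputs run free.
    s = rng.integers(0, 2, n).astype(float)
    s[vis_in] = x
    clamped = np.zeros(n, bool)
    clamped[vis_in] = True
    s_minus = settle(W, s, clamped)
    # Hebbian difference of co-activity statistics; symmetric, no self-weights.
    dW = eta * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
    np.fill_diagonal(dW, 0.0)
    return W + dW
```

Initializing `W` symmetric with a zero diagonal keeps units free of self-connections, and the outer-product form of the update preserves the weight symmetry that the network dynamics require.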
Keywords :
Bayes methods; Boltzmann machines; Hebbian learning; probability; constraints; probability distribution; symmetric diffusion networks; Algorithm design and analysis; Backpropagation algorithms; Bayesian methods; Cerebral cortex; Cognition; Hebbian theory; Neural networks; Sampling methods; Stochastic processes
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939053