DocumentCode
1843802
Title
Distributed ARTMAP
Author
Carpenter, Gail A.; Milenova, Boriana L.
Author_Institution
Dept. of Cognitive & Neural Syst., Boston Univ., MA, USA
Volume
3
fYear
1999
fDate
1999
Firstpage
1983
Abstract
Distributed coding at the hidden layer of a multilayer perceptron (MLP) endows the network with memory compression and noise tolerance capabilities. However, an MLP typically requires slow offline learning to avoid catastrophic forgetting in an open input environment. An adaptive resonance theory (ART) model is designed to guarantee stable memories even with fast online learning. However, ART stability typically requires winner-take-all coding, which may cause category proliferation in a noisy input environment. Distributed ARTMAP (dARTMAP) seeks to combine the computational advantages of MLP and ART systems in a real-time neural network for supervised learning. This system incorporates elements of the unsupervised dART model as well as new features, including a content-addressable memory rule. Simulations show that dARTMAP retains fuzzy ARTMAP accuracy while significantly improving memory compression. The model's computational learning rules correspond to paradoxical cortical data.
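For orientation, the sketch below shows the standard fuzzy ART winner-take-all category choice and match computations with complement coding, i.e., the coding scheme the abstract contrasts with distributed coding. It is not the paper's dARTMAP learning rule (which is not reproduced in this record); the function names, the toy weights, and the parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of fuzzy ART winner-take-all (WTA) coding.
# NOT the distributed dARTMAP rule from the paper; illustration only.

def complement_code(a):
    """Complement-code an input a in [0, 1]^M to x = (a, 1 - a)."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def fuzzy_and(x, w):
    """Fuzzy AND: component-wise minimum."""
    return np.minimum(x, w)

def wta_choice(x, weights, alpha=0.001):
    """Winner-take-all choice: J = argmax_j |x ^ w_j| / (alpha + |w_j|)."""
    T = [fuzzy_and(x, w).sum() / (alpha + w.sum()) for w in weights]
    return int(np.argmax(T))

def match(x, w):
    """Match value |x ^ w_J| / |x|, compared against the vigilance rho."""
    return fuzzy_and(x, w).sum() / x.sum()

# Toy usage: two committed category weight vectors, one 2-D input.
weights = [np.array([0.9, 0.1, 0.1, 0.9]),
           np.array([0.2, 0.8, 0.8, 0.2])]
x = complement_code([0.85, 0.15])
J = wta_choice(x, weights)
print("winner:", J, "match:", round(match(x, weights[J]), 3))
```

In WTA coding only the single winning category J learns on each trial, which underlies ART's stability guarantee but can proliferate categories under noise; dARTMAP instead distributes both activation and learning across categories, as summarized in the abstract.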
Keywords
ART neural nets; data compression; encoding; fuzzy neural nets; learning (artificial intelligence); multilayer perceptrons; ART model; adaptive resonance theory; content-addressable memory; distributed ARTMAP; distributed coding; fuzzy ARTMAP; memory compression; multilayer perceptron; neural network; online learning; supervised learning; Computational modeling; Computer networks; Distributed computing; Multilayer perceptrons; Neural networks; Real time systems; Resonance; Stability; Subspace constraints; Working environment noise;
fLanguage
English
Publisher
ieee
Conference_Title
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location
Washington, DC
ISSN
1098-7576
Print_ISBN
0-7803-5529-6
Type
conf
DOI
10.1109/IJCNN.1999.832688
Filename
832688