Title :
Multilayer associative neural networks (MANNs): storage capacity versus perfect recall
Author_Institution :
Dept. of Control & Instrum. Eng., Chungang Univ., Seoul, South Korea
fDate :
9/1/1994 12:00:00 AM
Abstract :
The objective of this paper is to resolve two important issues in artificial neural networks: exact recall and storage capacity in multilayer associative memories. These problems have imposed restrictions on coding strategies. We propose the following triple-layered hybrid neural network: the first synapse is a one-shot associative memory using the modified Kohonen adaptive learning algorithm with arbitrary input patterns; the second is Kosko's bidirectional associative memory consisting of orthogonal input/output basis vectors, such as Walsh series, satisfying the strict continuity condition; and the third is a simple one-shot associative memory with arbitrary output images. A mathematical framework based on the relationship between energy local minima (the capacity of the neural net) and noise-free recall is established. The robust capacity conditions of this multilayer associative neural network that lead to forming local minima of the energy function at the exact training pairs are derived. The chosen strategy not only maximizes the total number of stored images but also completely relaxes any code-dependent conditions on the learning pairs.
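As a rough illustration of why orthogonal coding in the middle layer yields perfect recall, the Python sketch below builds a Kosko-style bidirectional associative memory whose stored pairs are rows of a Walsh-Hadamard matrix. This is a minimal sketch under assumed toy dimensions, not the authors' implementation; the one-shot Hebbian storage rule and bipolar sign recall are standard BAM conventions rather than details taken from the abstract.

```python
# Minimal sketch (assumed toy sizes, not the paper's experiments): a Kosko-style
# BAM storing orthogonal bipolar codes drawn from a Walsh-Hadamard matrix.
# With orthogonal codes, one-shot Hebbian storage recalls every pair exactly.
import numpy as np

def hadamard(n: int) -> np.ndarray:
    """Sylvester construction of an n x n Walsh-Hadamard matrix (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n_pairs, dim = 3, 8                      # assumed toy dimensions
H = hadamard(dim)
X = H[1:1 + n_pairs]                     # orthogonal bipolar input codes
Y = H[1 + n_pairs:1 + 2 * n_pairs]       # orthogonal bipolar output codes

# One-shot (Hebbian) BAM storage: W = sum_k y_k x_k^T
W = Y.T @ X

sign = lambda v: np.where(v >= 0, 1, -1)

# Forward and backward recall; orthogonality makes both exact in a single pass,
# since W x_k = ||x_k||^2 y_k and W^T y_k = ||y_k||^2 x_k.
assert np.array_equal(sign(X @ W.T), Y)  # x_k -> y_k
assert np.array_equal(sign(Y @ W), X)    # y_k -> x_k
print("all stored pairs recalled exactly")
```

In the full architecture described above, this BAM sits between two one-shot associative layers that map arbitrary input patterns onto the orthogonal codes and the codes onto arbitrary output images, which is how the code-dependent conditions on the training pairs are relaxed.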
Keywords :
adaptive systems; character recognition; content-addressable storage; feedforward neural nets; learning (artificial intelligence); Kohonen adaptive learning algorithm; Kosko bidirectional associative memory; Walsh series; energy function; energy local minima; multilayer associative memories; multilayer associative neural networks; noise-free recall; orthogonal input/output basis vectors; storage capacity; triple-layered hybrid neural network; Artificial neural networks; Associative memory; Multi-layer neural network; Neural networks; Neurons; Noise robustness; Potential energy; Stability; Sufficient conditions
Journal_Title :
IEEE Transactions on Neural Networks