DocumentCode
1400550
Title
A new synthesis approach for feedback neural networks based on the perceptron training algorithm
Author
Liu, Derong; Lu, Zanjun
Author_Institution
Dept. of Electr. Eng. & Comput. Sci., Stevens Inst. of Technol., Hoboken, NJ, USA
Volume
8
Issue
6
fYear
1997
fDate
11/1/1997
Firstpage
1468
Lastpage
1482
Abstract
In this paper, a new synthesis approach is developed for associative memories based on the perceptron training algorithm. The design (synthesis) problem of feedback neural networks for associative memories is formulated as a set of linear inequalities, so that perceptron training can be applied directly. The perceptron training in the synthesis algorithms is guaranteed to converge for the design of neural networks without any constraints on the connection matrix. For neural networks with constraints on the diagonal elements of the connection matrix, results concerning the properties of such networks and the existence of such a network design are established. For neural networks with sparsity and/or symmetry constraints on the connection matrix, design algorithms are presented. Applications of the present synthesis approach to the design of associative memories realized by other feedback neural network models are also studied. To demonstrate the applicability of the present results and to compare the present synthesis approach with existing design methods, specific examples are considered.
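As a rough illustration of the idea summarized in the abstract (not the authors' published algorithm), the sketch below assumes a discrete-time feedback model x(k+1) = sgn(W x(k) + b) storing bipolar prototypes: requiring each prototype to be a fixed point yields, for every neuron j and prototype a, the linear inequality a_j (W_j · a + b_j) > 0, and each row of the connection matrix can then be obtained independently by standard perceptron updates. The function name synthesize and all parameters here are hypothetical.

import numpy as np

def synthesize(prototypes, lr=1.0, margin=1.0, max_epochs=1000):
    """Solve the row-wise linear inequalities with perceptron updates.

    prototypes: (m, n) array of bipolar (+1/-1) patterns to store.
    Returns W (n, n) and b (n,) such that sgn(W a + b) = a for every
    prototype a, provided the inequalities are feasible (perceptron
    convergence theorem).
    """
    P = np.asarray(prototypes, dtype=float)
    m, n = P.shape
    W = np.zeros((n, n))
    b = np.zeros(n)
    for j in range(n):                  # each row is an independent perceptron
        for _ in range(max_epochs):
            updated = False
            for a in P:
                if a[j] * (W[j] @ a + b[j]) <= margin:   # violated inequality
                    W[j] += lr * a[j] * a                # perceptron correction
                    b[j] += lr * a[j]
                    updated = True
            if not updated:             # all inequalities satisfied for row j
                break
    return W, b

if __name__ == "__main__":
    protos = np.array([[ 1, -1,  1, -1,  1],
                       [-1, -1,  1,  1, -1],
                       [ 1,  1, -1, -1,  1]])
    W, b = synthesize(protos)
    # verify each prototype is a fixed point of x -> sgn(W x + b)
    print(all(np.array_equal(np.sign(W @ a + b), a) for a in protos))

Because each row is trained separately, structural constraints (e.g., forcing selected entries of a row to zero for sparsity) can in principle be imposed per row, which is in the spirit of the constrained designs the abstract mentions; the unconstrained case above is the one where perceptron convergence is guaranteed whenever the inequalities are feasible.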
Keywords
associative processing; content-addressable storage; network synthesis; neural nets; associative memory; connection matrix; feedback neural networks; perceptron training; Algorithm design and analysis; Associative memory; Design methodology; Information retrieval; Linear matrix inequalities; Network synthesis; Neural networks; Neurofeedback; Nonlinear dynamical systems; Prototypes
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.641469
Filename
641469