Title :
Analysis and synthesis of neural networks using linear separation
Author_Institution :
IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
Abstract :
General analysis and synthesis methods for neural networks are presented. The proposed techniques are simple, efficient, and not restricted to a particular network architecture: they apply to multilayer, fully interconnected, feedforward, and feedback structures. Based on whether the signs of the connections between neurons (called weight signatures) are excitatory or inhibitory, the proposed methods provide fundamental rules of learnability in such networks. Various design techniques based on these learning rules are presented for the synthesis of neural architectures.
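The abstract refers to linear separation and to the signs of inter-neuron connections (weight signatures). As an illustrative sketch only, not the paper's actual method, the following trains a simple perceptron on a linearly separable Boolean function and reads off the sign pattern of the learned weights; all names and parameters here are hypothetical.

```python
# Illustrative sketch: a perceptron trained on a linearly separable
# Boolean function (AND). The signs of the converged weights give a
# "weight signature" (excitatory '+' vs. inhibitory '-').
# This is NOT the paper's synthesis procedure, only a related idea.

def train_perceptron(samples, epochs=100, lr=1.0):
    """samples: list of (inputs, target) pairs with targets in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        updated = False
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != t:
                for i in range(n):
                    w[i] += lr * (t - y) * x[i]
                b += lr * (t - y)
                updated = True
        if not updated:  # converged: the samples are linearly separable
            break
    return w, b

# Boolean AND is linearly separable, so the perceptron converges.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
signature = ['+' if wi > 0 else '-' for wi in w]  # both inputs are excitatory
```

For AND, both learned weights come out positive (an all-excitatory signature), while a function such as XOR is not linearly separable and the loop would never converge without a hidden layer.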
Keywords :
feedback; learning systems; neural nets; excitatory; feedback structures; feedforward; fully interconnected; inhibitory; learnability; learning rules; linear separation; multilayer structures; network architecture; neural networks; neurons; weight signatures; Computer networks; Information analysis; Logic; Multi-layer neural network; Network synthesis; Neural networks; Neurofeedback; Neurons; Signal generators; Signal synthesis;
Conference_Titel :
Proceedings of the 33rd Midwest Symposium on Circuits and Systems, 1990
Conference_Location :
Calgary, Alta.
Print_ISBN :
0-7803-0081-5
DOI :
10.1109/MWSCAS.1990.140643