DocumentCode :
2782916
Title :
Analysis and synthesis of neural networks using linear separation
Author :
Hokenek, Erdem
Author_Institution :
IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
fYear :
1990
fDate :
12-14 Aug 1990
Firstpage :
25
Abstract :
General analysis and synthesis methods for neural networks are presented. The proposed techniques are simple, efficient, and not restricted to a particular network architecture; they apply to multilayer, fully interconnected, feedforward, or feedback structures. Based on whether the signs of the connections between neurons (called weight signatures) are excitatory or inhibitory, the proposed methods provide fundamental rules of learnability in such networks. Various design techniques based on these learning rules are presented for the synthesis of neural architectures.
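The abstract does not reproduce the paper's derivations, but the two notions it names can be illustrated with a minimal sketch under standard definitions: the weight signature of a layer taken as the sign pattern (excitatory/inhibitory) of its weight matrix, and linear separation checked here with a plain perceptron on toy data. All function names, parameters, and the toy weights below are illustrative assumptions, not taken from the paper.

# A minimal sketch, not the paper's method: illustrates "weight signatures"
# (sign pattern of a layer's weights) and a crude linear-separability check.
import numpy as np

def weight_signature(W):
    # +1 = excitatory connection, -1 = inhibitory, 0 = no connection
    return np.sign(W).astype(int)

def linearly_separable(X, y, epochs=100, lr=0.1):
    # Train a perceptron; reaching zero errors indicates the labels
    # (in {-1, +1}) are linearly separable for this toy data.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified -> perceptron update
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:
            return True
    return False

if __name__ == "__main__":
    # Toy 3-input, 2-neuron layer (weights are made up for illustration).
    W = np.array([[0.8, -0.3, 0.0],
                  [-0.5, 0.2, 0.9]])
    print("weight signature:\n", weight_signature(W))

    # AND is linearly separable; XOR is not.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    print("AND separable:", linearly_separable(X, np.array([-1, -1, -1, 1])))
    print("XOR separable:", linearly_separable(X, np.array([-1, 1, 1, -1])))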
Keywords :
feedback; learning systems; neural nets; excitatory; feedback structures; feedforward; fully interconnected; inhibitory; learnability; learning rules; linear separation; multilayer structures; network architecture; neural networks; neurons; weight signatures; Computer networks; Information analysis; Logic; Multi-layer neural network; Network synthesis; Neural networks; Neurofeedback; Neurons; Signal generators; Signal synthesis;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 33rd Midwest Symposium on Circuits and Systems, 1990
Conference_Location :
Calgary, Alta.
Print_ISBN :
0-7803-0081-5
Type :
conf
DOI :
10.1109/MWSCAS.1990.140643
Filename :
140643