DocumentCode :
1577944
Title :
Artificial neural network with complex weight and its training
Author :
Shin, Yong-Chul ; Sridhar, Ramalingam
Author_Institution :
Dept. of Electr. & Comput. Eng., State Univ. of New York, Buffalo, NY, USA
fYear :
1992
Firstpage :
354
Abstract :
Artificial neural networks that use complex weights for the synaptic connections are presented. It is shown that the use of complex weights overcomes the linear nonseparability of functions such as exclusive-OR, so that such functions can be implemented with a single-layer network. The authors also present a modification of the backpropagation method to train the proposed network. Several examples, including symmetry problems, summation, and negation, are presented to demonstrate the effectiveness of complex weights. It is expected that this approach can implement functions of greater complexity using simpler networks (with fewer layers) than would be required with conventional approaches.
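The abstract describes a single-layer neuron with complex-valued synaptic weights that can realise exclusive-OR, trained by a modified backpropagation rule. The Python sketch below illustrates the idea only; it is not the authors' exact formulation. It assumes a magnitude-based squashing activation y = exp(-|z|^2) on the complex net input z and plain gradient descent over the real and imaginary weight parts; the neuron model and update rule used in the paper may differ.

# Illustrative sketch (assumed model, not the paper's): a single neuron with
# complex weights trained on XOR by gradient descent on the real and
# imaginary weight components.
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: linearly nonseparable for a single real-weight layer.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])

# Complex synaptic weights and bias, small random initialisation.
w = rng.normal(scale=0.3, size=2) + 1j * rng.normal(scale=0.3, size=2)
b = complex(rng.normal(scale=0.3), rng.normal(scale=0.3))

def forward(x):
    # Complex net input and a magnitude-based output in (0, 1].
    z = x @ w + b
    y = np.exp(-np.abs(z) ** 2)
    return y, z

lr = 0.2
for _ in range(20000):
    y, z = forward(X)
    err = y - t                    # dE/dy for E = 0.5 * sum((y - t)^2)
    # Gradients w.r.t. the real and imaginary weight parts, packed into one
    # complex number per weight: d|z|^2/dw_re = 2*Re(z)*x, d|z|^2/dw_im = 2*Im(z)*x.
    common = err * (-2.0) * y * z
    grad_w = common @ X            # sum over the four patterns
    grad_b = common.sum()
    w -= lr * grad_w
    b -= lr * grad_b

y, _ = forward(X)
print(np.round(y, 3))              # close to [0, 1, 1, 0] when training succeeds

With this magnitude-based activation the decision boundary in the input plane is a circle rather than a line, which is what allows a single unit to separate the XOR patterns without a hidden layer.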
Keywords :
backpropagation; neural nets; complex weight; linear nonseparability; negation; neural networks; summation; symmetry problems; synaptic connections; Adaptive systems; Artificial neural networks; Biological neural networks; Brain modeling; Cognition; Computer networks; Humans; Nervous system; Neurons; Pattern matching;
fLanguage :
English
Publisher :
ieee
Conference_Title :
RNNS/IEEE Symposium on Neuroinformatics and Neurocomputers, 1992
Conference_Location :
Rostov-on-Don
Print_ISBN :
0-7803-0809-3
Type :
conf
DOI :
10.1109/RNNS.1992.268552
Filename :
268552