DocumentCode :
3322225
Title :
Neural 'selective' processing and learning
Author :
Gelband, Patrice ; Tse, Edison
Author_Institution :
Adv. Decision Syst., Mountain View, CA, USA
fYear :
1988
fDate :
24-27 July 1988
Firstpage :
417
Abstract :
The authors show that by generalizing the threshold logic function to a multibandpass or 'selective' function, multilayer networks are effectively reduced to a single layer. As a result, learning is rapid and unimpeded by local minima. In addition, their learning algorithm introduces the selective function parameters adaptively. The learning algorithm has two stages. In the first stage, the values of the connections are chosen so that the family of hyperplanes net_j = c through the input space is oriented with the input data. For N-bit inputs, this is an O(N) process which requires only a single pass through the input words. In the second stage, the thresholds are introduced to appropriately segment the input space. This requires O(log_2 M) passes through the M input words. It is shown that the selective network does not in general attain the information-theoretic limits on storage capacity. However, the selective network requires significantly less hardware than other analytic models of memory when the data is oriented, and fewer connections (although more nodes) when the data is sparse.
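Illustration (a hedged sketch, not code from the paper): a standard threshold-logic unit fires when its net input w.x exceeds a single threshold, whereas a 'selective' or multibandpass unit fires when the net input falls inside any of several passbands, letting a single layer carve the input space into multiple regions. The band boundaries below stand in for the adaptively introduced selective function parameters; all names and values are illustrative assumptions.

import numpy as np

def selective_unit(x, w, bands):
    # Selective (multibandpass) activation: output 1 if the net input
    # net = w . x lies inside any passband [lo, hi], else 0.
    # With a single band (threshold, +inf) this reduces to ordinary
    # threshold logic.
    net = float(np.dot(w, x))
    return int(any(lo <= net <= hi for lo, hi in bands))

# Illustrative parameters: weights play the role of stage 1 (orienting the
# hyperplanes net_j = c with the data); the passbands play the role of the
# thresholds introduced in stage 2 to segment the input space.
w = np.array([1.0, -1.0, 0.5])
bands = [(-0.25, 0.25), (1.0, 1.5)]

print(selective_unit(np.array([1, 1, 0]), w, bands))   # net = 0.0  -> 1
print(selective_unit(np.array([1, 0, 1]), w, bands))   # net = 1.5  -> 1
print(selective_unit(np.array([0, 1, 0]), w, bands))   # net = -1.0 -> 0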
Keywords :
adaptive systems; artificial intelligence; learning systems; neural nets; learning algorithm; local minima; multibandpass function; multilayer neural networks; selective processing; threshold logic; Adaptive systems; Artificial intelligence; Learning systems; Neural networks;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Neural Networks, 1988, IEEE International Conference on
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/ICNN.1988.23874
Filename :
23874