DocumentCode :
2695648
Title :
Multiple descent cost competitive learning: batch and successive self-organization with excitatory and inhibitory connections
Author :
Matsuyama, Yasuo
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
299
Abstract :
Novel general algorithms for multiple-descent cost-competitive learning are presented. These algorithms self-organize neural networks and possess the following features: optimal grouping of applied vector inputs, product form of neurons, neural topologies, excitatory and inhibitory connections, fair competitive bias, oblivion, a winner-take-quota rule, stochastic update, and applicability to a wide class of costs. Both batch and successive training algorithms are given. Each type has its own merits, but the two classes are equivalent: a problem solved in the batch mode can be computed successively, and vice versa. The algorithms cover a class of combinatorial optimizations in addition to traditional standard pattern set design.
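The abstract contrasts a winner-take-quota rule with the usual winner-take-all step in competitive learning. The following is a minimal illustrative sketch of successive (online) competitive learning in which the `quota` nearest units share the correction; the function name, parameters, and squared-error cost are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def successive_competitive_learning(inputs, n_units, quota=2, lr=0.1, seed=0):
    """Online (successive) competitive learning sketch.

    Instead of a strict winner-take-all step, the `quota` nearest
    units each receive a share of the update (a winner-take-quota
    style rule). Squared Euclidean distance is the assumed cost.
    """
    rng = np.random.default_rng(seed)
    dim = inputs.shape[1]
    weights = rng.normal(size=(n_units, dim))
    for x in inputs:
        # distances from every unit's weight vector to the input
        d = np.sum((weights - x) ** 2, axis=1)
        # the `quota` closest units win the competition
        winners = np.argsort(d)[:quota]
        # each winner moves toward the input, sharing the update
        weights[winners] += lr * (x - weights[winners]) / quota
    return weights
```

A batch variant would accumulate the assignments over the whole input set before updating the weights once per sweep; the abstract's equivalence claim is that either mode can reproduce the other's result.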
Keywords :
learning systems; neural nets; applied vector inputs; combinatorial optimizations; cost-competitive learning; excitatory connections; fair competitive bias; inhibitory connections; multiple-descent; neural networks; neural topologies; oblivion; optimal grouping; pattern set design; product form; stochastic update; successive self-organization; successive training algorithms; winner-take-quota rule;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137730
Filename :
5726689