DocumentCode :
1946590
Title :
Forced Information Maximization to Accelerate Information-Theoretic Competitive Learning
Author :
Kamimura, Ryotaro ; Kitajima, Ryozo
Author_Institution :
Tokai Univ., Kanagawa
fYear :
2007
fDate :
12-17 Aug. 2007
Firstpage :
1779
Lastpage :
1784
Abstract :
Information-theoretic competitive learning has been shown to be a more general and more flexible type of competitive learning. However, one of its major shortcomings is that learning is sometimes very slow. To overcome this problem, we introduce forced information, which forces networks to increase information by supposing maximum information. We applied the method to an artificial data set as well as to a student survey. In both cases, we observed that information increased very rapidly to stable points. Compared with results obtained by principal component analysis, our method showed the main features of the input patterns more clearly. In addition, the main mechanism of feature detection can be explained more easily with forced information.
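A minimal sketch of the quantity that information-theoretic competitive learning maximizes: the mutual information between input patterns and competitive units. The Gaussian activation form, the uniform pattern probability, and all function names are assumptions made for illustration; the sketch does not implement the paper's forced-information acceleration.

```python
# Illustrative sketch (not the authors' exact formulation): mutual information
# between input patterns and competitive units.
import numpy as np

def unit_probabilities(X, W, sigma=1.0):
    """p(j|s): firing probability of unit j for input pattern s,
    assumed here to come from a Gaussian of the input-to-weight distance."""
    # squared distances, shape (patterns, units)
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    act = np.exp(-d2 / (2.0 * sigma ** 2))
    return act / act.sum(axis=1, keepdims=True)

def mutual_information(p_js):
    """I = sum_s p(s) sum_j p(j|s) log(p(j|s) / p(j)), with uniform p(s)."""
    p_s = 1.0 / p_js.shape[0]
    p_j = p_js.mean(axis=0)                      # marginal p(j)
    ratio = np.clip(p_js / p_j, 1e-12, None)     # avoid log(0)
    return p_s * np.sum(p_js * np.log(ratio))

# Toy example: random input patterns and competitive-unit weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))      # 20 input patterns, 5 features
W = rng.normal(size=(3, 5))       # 3 competitive units
p = unit_probabilities(X, W)
print("mutual information:", mutual_information(p))
```

Maximizing this mutual information drives each unit to respond to a distinct subset of patterns; the paper's forced information accelerates this by assuming the maximum-information (winner-take-all) state during learning.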
Keywords :
feature extraction; learning (artificial intelligence); principal component analysis; artificial data method; feature detection; forced information maximization; information-theoretic competitive learning; principal component analysis; Acceleration; Computer architecture; Computer vision; Data mining; Entropy; Feature extraction; Neural networks; Neurons; Principal component analysis; Uncertainty; competitive learning; dead neurons; forced information; information loss; mutual information maximization; winner-take-all;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2007. IJCNN 2007. International Joint Conference on
Conference_Location :
Orlando, FL
ISSN :
1098-7576
Print_ISBN :
978-1-4244-1379-9
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2007.4371227
Filename :
4371227