DocumentCode :
352481
Title :
Context quantization and contextual self-organizing maps
Author :
Voegtlin, Thomas
Author_Institution :
Inst. des Sci. Cognitives, CNRS, Bron, France
Volume :
6
fYear :
2000
fDate :
2000
Firstpage :
20
Abstract :
Vector quantization consists of finding a discrete approximation of a continuous input. One of the most popular neural algorithms related to vector quantization is the so-called Kohonen map. We generalize vector quantization to temporal data by introducing context quantization. We propose a recurrent network inspired by the Kohonen map, the contextual self-organizing map, which develops near-optimal representations of context. We demonstrate quantitatively that this algorithm outperforms the other neural methods proposed so far.
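(Illustrative note: the sketch below shows only the ordinary, non-contextual Kohonen map performing vector quantization, i.e. a set of prototypes forming a discrete approximation of continuous inputs. It is not the paper's contextual self-organizing map, and all names and parameter values in it are hypothetical choices, not taken from the source.)

    # Minimal sketch of standard Kohonen-map (SOM) vector quantization.
    # Hypothetical example code; NOT the paper's contextual SOM algorithm.
    import numpy as np

    def som_train(data, grid_size=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
        """Train a 1-D Kohonen map whose prototypes quantize the continuous input."""
        rng = np.random.default_rng(seed)
        dim = data.shape[1]
        weights = rng.normal(size=(grid_size, dim))   # one prototype per map unit
        positions = np.arange(grid_size)
        n_steps = epochs * len(data)
        step = 0
        for _ in range(epochs):
            for x in rng.permutation(data):
                # Decay learning rate and neighborhood width over training.
                frac = step / n_steps
                lr = lr0 * (1.0 - frac)
                sigma = sigma0 * (1.0 - frac) + 1e-3
                # Best-matching unit: prototype closest to the input.
                bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
                # Gaussian neighborhood pulls nearby units toward the input too.
                h = np.exp(-((positions - bmu) ** 2) / (2 * sigma ** 2))
                weights += lr * h[:, None] * (x - weights)
                step += 1
        return weights

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        samples = rng.uniform(-1.0, 1.0, size=(500, 2))   # continuous input
        prototypes = som_train(samples)                     # discrete approximation
        codes = np.argmin(
            np.linalg.norm(samples[:, None, :] - prototypes[None], axis=2), axis=1
        )
        print("mean quantization error:",
              np.mean(np.linalg.norm(samples - prototypes[codes], axis=1)))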
Keywords :
recurrent neural nets; self-organising feature maps; trees (mathematics); vector quantisation; Kohonen map; context quantization; contextual self-organizing maps; continuous input; discrete approximation; near-optimal representations; temporal data; Neurons; Prototypes; Self organizing feature maps; Statistics; Stochastic processes; Unsupervised learning; Vector quantization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.859367
Filename :
859367