Title :
Magnification in divergence based neural maps
Author :
Villmann, T. ; Haase, S.
Author_Institution :
Dept. of Math./Natural & Comput. Sci., Univ. of Appl. Sci. Mittweida, Mittweida, Germany
Date :
July 31, 2011 - Aug. 5, 2011
Abstract :
In this paper, we study the magnification behavior of neural maps that use several (parametrized) divergences as dissimilarity measures in place of the Euclidean distance. We show experimentally that optimal magnification, i.e. information-optimal data coding by the prototypes, can be achieved for properly chosen divergence parameters. The divergences considered here represent all main classes of divergences. Hence, we conclude that information-optimal vector quantization can be realized independently of the divergence class by an appropriate parameter setting.
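For orientation, the sketch below illustrates the basic idea the abstract describes: an online self-organizing map in which both the winner selection and the prototype update use a divergence instead of the squared Euclidean distance. This is a minimal illustration, not the authors' implementation; for simplicity it uses the unparametrized generalized Kullback-Leibler divergence, whereas the paper studies parametrized divergence families whose free parameter would enter in the same two places (winner selection and gradient update). All function and parameter names here are illustrative assumptions.

    import numpy as np

    def gkl_divergence(x, w, eps=1e-12):
        # Generalized Kullback-Leibler divergence D(x || w) for
        # non-negative vectors (one representative divergence family).
        x = x + eps
        w = w + eps
        return np.sum(x * np.log(x / w) - x + w)

    def train_divergence_som(data, n_prototypes=10, epochs=30,
                             lr0=0.5, sigma0=3.0, seed=0):
        # Online 1-D SOM: the winner is the prototype with minimal
        # divergence to the input, and the update follows the divergence
        # gradient -dD(x||w)/dw = x/w - 1 rather than x - w.
        rng = np.random.default_rng(seed)
        dim = data.shape[1]
        W = rng.random((n_prototypes, dim)) + 0.1   # positive initialization
        lattice = np.arange(n_prototypes)
        T = epochs * len(data)
        t = 0
        for _ in range(epochs):
            for x in data[rng.permutation(len(data))]:
                lr = lr0 * (0.01 / lr0) ** (t / T)          # learning-rate decay
                sigma = sigma0 * (0.5 / sigma0) ** (t / T)  # shrinking neighborhood
                dists = np.array([gkl_divergence(x, w) for w in W])
                s = int(np.argmin(dists))                   # divergence-based winner
                h = np.exp(-(lattice - s) ** 2 / (2.0 * sigma ** 2))
                W += lr * h[:, None] * (x / W - 1.0)        # gradient step on D(x||w)
                W = np.maximum(W, 1e-12)                    # keep prototypes non-negative
                t += 1
        return W

    # usage (hypothetical data): non-negative samples, e.g. exponentially distributed
    prototypes = train_divergence_som(np.random.default_rng(1).exponential(size=(500, 4)))

Substituting a parametrized divergence (and sweeping its parameter) into the two marked places is, in the spirit of the abstract, what allows the map's magnification to be tuned toward information-optimal coding.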
Keywords :
self-organising feature maps; vector quantisation; Euclidean distance; divergence-based neural maps; magnification behavior; information-optimal vector quantization; information-optimal data coding; entropy; prototypes;
Conference_Title :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033254