Title :
LBG-m: a modified LBG architecture to extract high-order neural structures
Author_Institution :
ITD, CNR, Palermo, Italy
Abstract :
Neural networks that learn in an unsupervised way and generate their topology during learning are useful for building topology-representing structures. These networks can be used for vector quantization and clustering whenever the topology of the underlying data distribution must be characterized. A drawback of these networks is that the structures they create have the same complexity as the input data, so the structure must be simplified before the user can visualize and manipulate these representations. The aim of the proposed algorithm is to simplify the graph structure created by this kind of neural network. The LBG-m algorithm takes the positions of the nodes and the adjacency matrix of the graph as input and builds an over-imposed graph that clusters the graph nodes and tries to reproduce the “shape” of the input graph.
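The abstract describes the algorithm only at the level of its interface: node positions and an adjacency matrix in, a coarser "over-imposed" graph out. The sketch below is an illustrative stand-in, not the paper's LBG-m: it uses a plain k-means quantizer (with farthest-point seeding) in place of the LBG codebook training, and the names `simplify_graph` and `coarse` are assumptions. Two centroids of the over-imposed graph are linked whenever an edge of the input graph crosses their clusters, which is one simple way to preserve the input graph's shape.

```python
import numpy as np

def simplify_graph(positions, adjacency, k, n_iter=50):
    """Cluster graph nodes with a k-means codebook (a stand-in for an
    LBG-trained quantizer) and induce a coarse over-imposed graph:
    two centroids are linked if any input edge crosses their clusters."""
    positions = np.asarray(positions, dtype=float)
    # Farthest-point seeding: start from node 0, then repeatedly add the
    # node farthest from all centroids chosen so far (deterministic).
    centroids = [positions[0].copy()]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(positions - c, axis=1) for c in centroids], axis=0)
        centroids.append(positions[d.argmax()].copy())
    centroids = np.array(centroids)
    for _ in range(n_iter):
        # Assign each node to its nearest centroid.
        d = np.linalg.norm(positions[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned nodes.
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = positions[labels == c].mean(axis=0)
    # Coarse adjacency: clusters that share an input edge become adjacent.
    coarse = np.zeros((k, k), dtype=int)
    for a, b in zip(*np.nonzero(adjacency)):
        if labels[a] != labels[b]:
            coarse[labels[a], labels[b]] = coarse[labels[b], labels[a]] = 1
    return centroids, labels, coarse

# Toy input graph: two well-separated node pairs joined by one bridge edge.
pos = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 0.0], [10.0, 1.0]])
adj = np.zeros((4, 4), dtype=int)
for a, b in [(0, 1), (2, 3), (0, 2)]:
    adj[a, b] = adj[b, a] = 1
cent, labels, coarse = simplify_graph(pos, adj, k=2)
# The bridge edge (0, 2) survives as the single edge of the coarse graph.
```

On this toy input the two left nodes and the two right nodes collapse into one centroid each, and the coarse graph keeps exactly one edge, mirroring the bridge in the original.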
Keywords :
Hebbian learning; graph theory; matrix algebra; neural nets; unsupervised learning; LBG-m; adjacency matrix; clustering; graph structure; high-order neural structures; modified LBG architecture; over-imposed graph; topology representing structures; underlying data distribution; vector quantization; Clustering algorithms; Data visualization; Network topology; Neural networks; Vector quantization;
Conference_Titel :
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939458