DocumentCode :
1628515
Title :
A self-structurizing neural network for online incremental learning
Author :
Hasegawa, Osamu ; Shen, Furao
Author_Institution :
Tokyo Inst. of Technol., Midori-ku, Japan
Volume :
3
fYear :
2004
Firstpage :
2063
Abstract :
An on-line unsupervised learning mechanism is proposed for unlabeled data that is corrupted by noise. Using a similarity threshold and a local-error-based insertion criterion, the system grows incrementally and accommodates input patterns from an on-line, non-stationary data distribution. A utility parameter, the "error radius," enables the system to learn the number of nodes needed for the current task. A new technique for removing nodes in low-probability-density regions separates clusters that overlap in low-density regions and dynamically eliminates noise from the input data. A two-layer neural network design allows the system to represent the topological structure of unsupervised on-line data, report a reasonable number of clusters, and give typical prototype patterns for every cluster without prior conditions such as a suitable number of nodes or a good initial codebook.
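For concreteness, the following is a minimal Python sketch of the insertion-or-adaptation rule summarized in the abstract: each input is compared against the similarity thresholds of its nearest and second-nearest nodes, accepted as a new node if it falls outside either threshold, and otherwise used to adapt the winner; nodes that rarely win (a rough low-density proxy) are pruned periodically. The class and parameter names (IncrementalLayer, adapt_rate, prune_period, min_wins), the concrete threshold definition, and the win-count pruning rule are illustrative assumptions, not the authors' exact algorithm.

# Minimal sketch of similarity-threshold-based incremental node insertion.
# Names, parameter values, and the pruning rule are illustrative assumptions.
import numpy as np


class IncrementalLayer:
    def __init__(self, adapt_rate=0.05, prune_period=200, min_wins=2):
        self.nodes = []        # prototype weight vectors
        self.wins = []         # per-node winning counts (density proxy)
        self.adapt_rate = adapt_rate
        self.prune_period = prune_period
        self.min_wins = min_wins
        self.seen = 0

    def _threshold(self, i):
        # Similarity threshold of node i: distance to its nearest other node.
        # With fewer than two nodes the threshold is infinite, so early
        # inputs are always accepted as new nodes.
        if len(self.nodes) < 2:
            return np.inf
        return min(np.linalg.norm(self.nodes[i] - w)
                   for j, w in enumerate(self.nodes) if j != i)

    def update(self, x):
        self.seen += 1
        x = np.asarray(x, dtype=float)
        if len(self.nodes) < 2:
            self.nodes.append(x.copy())
            self.wins.append(1)
            return
        # Find the winner (nearest node) and second winner.
        d = [np.linalg.norm(x - w) for w in self.nodes]
        order = np.argsort(d)
        s1, s2 = order[0], order[1]
        if d[s1] > self._threshold(s1) or d[s2] > self._threshold(s2):
            # Input lies outside the similarity thresholds: insert a new node.
            self.nodes.append(x.copy())
            self.wins.append(1)
        else:
            # Otherwise adapt the winner toward the input.
            self.nodes[s1] += self.adapt_rate * (x - self.nodes[s1])
            self.wins[s1] += 1
        # Periodically remove rarely winning nodes (low-density regions),
        # which suppresses noise and separates overlapping clusters.
        if self.seen % self.prune_period == 0:
            keep = [k for k, c in enumerate(self.wins) if c >= self.min_wins]
            self.nodes = [self.nodes[k] for k in keep]
            self.wins = [self.wins[k] for k in keep]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two Gaussian clusters plus uniform background noise.
    data = np.vstack([
        rng.normal([0.0, 0.0], 0.1, size=(500, 2)),
        rng.normal([3.0, 3.0], 0.1, size=(500, 2)),
        rng.uniform(-1.0, 4.0, size=(50, 2)),
    ])
    rng.shuffle(data)
    layer = IncrementalLayer()
    for x in data:
        layer.update(x)
    print("prototype nodes:", len(layer.nodes))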
Keywords :
neural nets; pattern clustering; probability; unsupervised learning; error radius; machine learning; nonstationary data distribution; online incremental learning; self-structurizing neural network; topological structure;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
SICE 2004 Annual Conference
Conference_Location :
Sapporo
Print_ISBN :
4-907764-22-7
Type :
conf
Filename :
1491783