DocumentCode :
1749194
Title :
Learning high-degree sequences in a linear network
Author :
Voegtlin, Thomas ; Dominey, Peter F.
Author_Institution :
Inst. des Sci. Cognitives, CNRS, Bron, France
Volume :
2
fYear :
2001
fDate :
2001
Firstpage :
940
Abstract :
An unsupervised learning algorithm for recurrent neural networks is proposed that generalizes PCA to time series. A linear recurrent neural network using Oja's constrained Hebbian learning rule is presented. We demonstrate that this network extracts complex temporal information from a sequence of inputs. Temporal sequences stored in the network can be retrieved in the reverse order of presentation, providing a straightforward implementation of a logical stack.
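Example (illustrative, not from the paper) :
The abstract refers to Oja's constrained Hebbian learning rule as the building block of the proposed network. The sketch below shows the standard Oja rule for a single linear unit on static data, which converges to the first principal component; the paper's recurrent, time-series extension and the stack-like retrieval are not reproduced here, and the toy data, learning rate, and variable names are assumptions.

```python
import numpy as np

# Minimal sketch (assumption): Oja's constrained Hebbian rule for one linear
# unit y = w . x. The Hebbian term eta*y*x is combined with the decay term
# -eta*y^2*w, which keeps ||w|| bounded and drives w toward the first
# principal component of the input data.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5)) @ np.diag([3.0, 1.0, 0.5, 0.2, 0.1])  # toy inputs

w = rng.standard_normal(5)
eta = 0.01                          # learning rate (assumed value)
for x in X:
    y = w @ x                       # linear unit output
    w += eta * y * (x - y * w)      # Oja's constrained Hebbian update

# After training, w has approximately unit norm and aligns with the
# dominant principal direction of X.
print(np.linalg.norm(w), w)
```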
Keywords :
Hebbian learning; principal component analysis; recurrent neural nets; time series; unsupervised learning; Oja constraint; linear network; recurrent neural networks; temporal sequences; Artificial neural networks; Data mining; Intelligent networks; Learning automata; Natural languages; Performance analysis; Principal component analysis; Recurrent neural networks; Unsupervised learning
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN '01), 2001. Proceedings
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939486
Filename :
939486