DocumentCode :
1301429
Title :
Global convergence of Oja's subspace algorithm for principal component extraction
Author :
Chen, Tianping ; Hua, Yingbo ; Yan, Wei-Yong
Author_Institution :
Dept. of Math., Fudan Univ., Shanghai, China
Volume :
9
Issue :
1
fYear :
1998
fDate :
1/1/1998
Firstpage :
58
Lastpage :
67
Abstract :
Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information in time series. This paper undertakes a thorough investigation of the convergence properties of Oja's algorithm. The asymptotic convergence rates of the algorithm are established, and the dependence of the algorithm on its initial weight matrix and on the singularity of the data covariance matrix is comprehensively addressed.
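For reference, a minimal sketch of the discrete-time Oja subspace update rule mentioned in the abstract is given below. This is an illustrative NumPy implementation, not the paper's own code; the dimensions, step size, and synthetic data are assumptions chosen only to show the form of the update W <- W + eta * (x y^T - W y y^T) with y = W^T x.

import numpy as np

rng = np.random.default_rng(0)
n, p, eta, steps = 10, 3, 0.01, 5000   # illustrative dimensions and step size

# Synthetic zero-mean data whose covariance C has a dominant subspace.
A = rng.standard_normal((n, n))
C = A @ A.T / n
L = np.linalg.cholesky(C + 1e-9 * np.eye(n))

W = rng.standard_normal((n, p))        # initial weight matrix

for _ in range(steps):
    x = L @ rng.standard_normal(n)     # sample with covariance approximately C
    y = W.T @ x                        # p-dimensional output
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))   # Oja's subspace rule

# Columns of W should now approximately span the principal subspace of C,
# and W should be nearly orthonormal.
print(np.linalg.norm(W.T @ W - np.eye(p)))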
Keywords :
convergence of numerical methods; covariance matrices; differential equations; eigenvalues and eigenfunctions; parallel algorithms; statistical analysis; Oja subspace algorithm; asymptotic convergence rates; covariance matrix; differential equations; eigenvalues; feature extraction; global convergence; initial weight matrix; parallel algorithm; principal component analysis; time series; Convergence; Covariance matrix; Data mining; Differential equations; Feature extraction; Information processing; Least squares methods; Principal component analysis; Random processes; Singular value decomposition;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.655030
Filename :
655030