DocumentCode :
1326311
Title :
Algorithms for accelerated convergence of adaptive PCA
Author :
Chatterjee, Chanchal ; Kang, Zhengjiu ; Roychowdhury, Vwani P.
Author_Institution :
BAE Syst. Inc., San Diego, CA, USA
Volume :
11
Issue :
2
fYear :
2000
fDate :
3/1/2000 12:00:00 AM
Firstpage :
338
Lastpage :
355
Abstract :
We derive and discuss adaptive algorithms for principal component analysis (PCA) that are shown to converge faster than the traditional PCA algorithms due to Oja and Karhunen (1985), Sanger (1989), and Xu (1993). It is well known that traditional PCA algorithms that are derived by using gradient descent on an objective function are slow to converge. Furthermore, the convergence of these algorithms depends on appropriate choices of the gain sequences. Since online applications demand faster convergence and an automatic selection of gains, we present new adaptive algorithms to solve these problems. We first present an unconstrained objective function, which can be minimized to obtain the principal components. We derive adaptive algorithms from this objective function by using: (1) gradient descent; (2) steepest descent; (3) conjugate direction; and (4) Newton-Raphson methods. Although gradient descent produces Xu´s LMSER algorithm, the steepest descent, conjugate direction, and Newton-Raphson methods produce new adaptive algorithms for PCA. We also provide a discussion on the landscape of the objective function, and present a global convergence proof of the adaptive gradient descent PCA algorithm using stochastic approximation theory. Extensive experiments with stationary and nonstationary multidimensional Gaussian sequences show faster convergence of the new algorithms over the traditional gradient descent methods. We also compare the steepest descent adaptive algorithm with state-of-the-art methods on stationary and nonstationary sequences.
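For orientation, the gradient-descent family the abstract refers to reduces, in the single-component case, to Oja's classic adaptive learning rule with a decaying gain sequence. The sketch below is illustrative only: the synthetic covariance, gain schedule `1/(100 + t)`, and step count are hypothetical choices, not the paper's experimental setup or its accelerated variants.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stationary Gaussian data with a known principal direction:
# covariance diag(5, 1, 0.2), so the top eigenvector is e1 = (1, 0, 0).
d = 3
stds = np.array([5.0, 1.0, 0.2]) ** 0.5

# Random unit-norm initial estimate of the principal component.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)

for t in range(20000):
    x = stds * rng.standard_normal(d)  # sample x ~ N(0, diag(5, 1, 0.2))
    y = w @ x                          # projection onto current estimate
    eta = 1.0 / (100.0 + t)            # decaying gain (hypothetical schedule)
    w += eta * (y * x - y * y * w)     # Oja's single-unit gradient update

# w should align with +/- e1 and have near-unit norm.
print(abs(w[0]), np.linalg.norm(w))
```

The slow convergence and gain-sensitivity criticized in the abstract are visible here: the 1/t-style gain must be hand-tuned, which is exactly what the paper's steepest descent, conjugate direction, and Newton-Raphson variants aim to avoid.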
Keywords :
Newton-Raphson method; approximation theory; convergence; eigenvalues and eigenfunctions; gradient methods; least squares approximations; principal component analysis; sequences; LMSER algorithm; Newton-Raphson methods; PCA; accelerated convergence; adaptive algorithms; adaptive principal component analysis; conjugate direction; gain sequences; global convergence proof; gradient descent; multidimensional Gaussian sequences; nonstationary sequences; stationary sequences; steepest descent; stochastic approximation theory; unconstrained objective function; Acceleration; Adaptive algorithm; Algorithm design and analysis; Approximation algorithms; Convergence; Direction of arrival estimation; Frequency estimation; Multiple signal classification; Principal component analysis; Signal processing algorithms;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.839005
Filename :
839005