Title :
Non-Divergence of Stochastic Discrete Time Algorithms for PCA Neural Networks
Author :
Jian Cheng Lv ; Zhang Yi ; Yunxia Li
Author_Institution :
Machine Intell. Lab., Sichuan Univ., Chengdu, China
Abstract :
Learning algorithms play an important role in the practical application of neural networks based on principal component analysis (PCA), often determining the success, or otherwise, of these applications. These algorithms must not diverge, but it is very difficult to study their convergence properties directly, because they are described by stochastic discrete time (SDT) equations. This brief analyzes the original SDT algorithms directly and derives invariant sets that guarantee the nondivergence of these algorithms in a stochastic environment, provided the learning parameters are properly selected. Our theoretical results are verified by a series of simulation examples.
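The abstract's setting can be illustrated with a minimal sketch. Oja's single-neuron rule is a classic SDT learning algorithm for PCA (the specific algorithms analyzed in the paper are not reproduced here); the sketch below only demonstrates the qualitative claim that, with a sufficiently small learning rate, the weight vector remains in a bounded (invariant) set rather than diverging. The function name `oja_sdt`, the step size `eta`, and the synthetic data are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_sdt(samples, eta=0.01, n_steps=5000):
    """Oja's rule in stochastic discrete time; returns the final weight
    vector and the history of its norms (to check boundedness)."""
    dim = samples.shape[1]
    w = rng.standard_normal(dim)
    w /= np.linalg.norm(w)                        # start on the unit sphere
    norms = []
    for _ in range(n_steps):
        x = samples[rng.integers(len(samples))]   # stochastic input sample
        y = w @ x                                 # neuron output
        w = w + eta * y * (x - y * w)             # Oja SDT update
        norms.append(np.linalg.norm(w))
    return w, norms

# Correlated 3-D data whose principal direction is (approximately) e1.
cov = np.diag([4.0, 1.0, 0.25])
data = rng.multivariate_normal(np.zeros(3), cov, size=2000)

w, norms = oja_sdt(data)
print(max(norms))   # remains bounded for this small eta
```

With a small `eta` the weight norm stays near 1 and `w` aligns with the leading eigenvector; a much larger `eta` can drive the iterates out of any bounded set, which is exactly the kind of parameter condition the paper's invariant-set analysis makes precise.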
Keywords :
discrete time systems; learning (artificial intelligence); neural nets; principal component analysis; stochastic processes; PCA neural networks; SDT algorithm; learning algorithm; stochastic discrete time algorithm nondivergence; algorithm design and analysis; approximation algorithms; convergence; heuristic algorithms; signal processing algorithms; neural networks; nondivergence; principal component analysis (PCA); stochastic discrete time (SDT) method
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
DOI :
10.1109/TNNLS.2014.2312421