• DocumentCode
    384258
  • Title
    Fast linear discriminant analysis for on-line pattern recognition applications
  • Author
    Moghaddam, H. Abrishami; Zadeh, Kh. Amiri
  • Author_Institution
    K.N. Toossi Univ. of Technol., Tehran, Iran
  • Volume
    2
  • fYear
    2002
  • fDate
    2002
  • Firstpage
    64
  • Abstract
    In this paper, a new adaptive algorithm for Linear Discriminant Analysis (LDA) is presented. The major advantage of the algorithm is its fast convergence rate, which distinguishes it from existing on-line methods. Current adaptive methods based on the gradient descent optimization technique use a fixed or a monotonically decreasing step size in each iteration. In this work, we use the steepest descent optimization method to determine the optimal step size in each iteration. It is shown that an optimally variable step size significantly improves the convergence rate of the algorithm compared with conventional methods. The new algorithm has been implemented using a self-organized neural network, and its advantages in on-line pattern recognition applications are demonstrated.
  • Keywords
    convergence of numerical methods; feature extraction; gradient methods; image recognition; self-organising feature maps; adaptive algorithm; fast convergence rate; fast linear discriminant analysis; iteration step size; on-line pattern recognition; optimally variable step size; self-organized neural network; steepest descent optimization method; Acceleration; Adaptive algorithm; Algorithm design and analysis; Convergence; Face detection; Feature extraction; Linear discriminant analysis; Optimization methods; Pattern recognition; Principal component analysis
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 16th International Conference on Pattern Recognition, 2002
  • ISSN
    1051-4651
  • Print_ISBN
    0-7695-1695-X
  • Type
    conf
  • DOI
    10.1109/ICPR.2002.1048237
  • Filename
    1048237
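
The abstract above contrasts conventional adaptive LDA, which uses a fixed or monotonically decreasing step size, with an optimally variable step size chosen via steepest descent at each iteration. The snippet below is a minimal, hypothetical sketch of that contrast for gradient ascent on the two-class Fisher criterion; it is not the paper's algorithm or its self-organized neural-network implementation, and the crude line search merely stands in for the analytically optimal step size the paper derives. All function names and parameters are assumptions for illustration.

```python
# Minimal illustrative sketch (assumption: not the paper's algorithm or its
# self-organized neural-network form). It contrasts a fixed step size with a
# per-iteration step chosen by a crude line search, for gradient ascent on the
# two-class Fisher criterion J(w) = (w' Sb w) / (w' Sw w).
import numpy as np


def fisher_criterion(w, Sb, Sw):
    """Value of the Fisher discriminant ratio at direction w."""
    return (w @ Sb @ w) / (w @ Sw @ w)


def fisher_gradient(w, Sb, Sw):
    """Gradient of the Fisher ratio with respect to w."""
    num, den = w @ Sb @ w, w @ Sw @ w
    return 2.0 * ((Sb @ w) * den - (Sw @ w) * num) / den ** 2


def gradient_ascent_lda(Sb, Sw, step="line_search", eta=0.01, iters=200, seed=0):
    """Gradient ascent on J(w); 'line_search' picks the step per iteration."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Sb.shape[0])
    for _ in range(iters):
        g = fisher_gradient(w, Sb, Sw)
        if step == "line_search":
            # Stand-in for an optimally variable step size: evaluate J along
            # the gradient direction and keep the best candidate step.
            candidates = np.logspace(-4, 1, 30)
            alpha = max(candidates, key=lambda a: fisher_criterion(w + a * g, Sb, Sw))
        else:
            alpha = eta  # fixed step size, as in conventional adaptive methods
        w = w + alpha * g
        w = w / np.linalg.norm(w)  # J is scale-invariant, so renormalize
    return w, fisher_criterion(w, Sb, Sw)


# Hypothetical usage on synthetic two-class data.
rng = np.random.default_rng(1)
x0 = rng.standard_normal((200, 3))
x1 = rng.standard_normal((200, 3)) + np.array([2.0, 1.0, 0.0])
m0, m1 = x0.mean(axis=0), x1.mean(axis=0)
Sw = np.cov(x0, rowvar=False) + np.cov(x1, rowvar=False)  # within-class scatter
Sb = np.outer(m1 - m0, m1 - m0)                           # between-class scatter
w_ls, j_ls = gradient_ascent_lda(Sb, Sw, step="line_search")
w_fx, j_fx = gradient_ascent_lda(Sb, Sw, step="fixed")
print(f"line-search J = {j_ls:.3f}, fixed-step J = {j_fx:.3f}")
```

With the same number of iterations, the line-searched variant typically reaches a higher criterion value sooner than the fixed-step run, which is the qualitative behavior the abstract attributes to the optimally variable step size.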