• DocumentCode
    445947
  • Title
    Diagonally weighted and shifted criteria for minor and principal component extraction
  • Author
    Hasan, Mohammed A.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Minnesota Univ., Duluth, MN, USA
  • Volume
    2
  • fYear
    2005
  • fDate
    31 July-4 Aug. 2005
  • Firstpage
    1251
  • Abstract
    A framework for a class of minor and principal component learning rules is presented. These rules compute multiple eigenvectors rather than only a basis for a multi-dimensional eigenspace. Several MCA/PCA cost functions, weighted or shifted by a diagonal matrix, are optimized subject to orthogonal or symmetric constraints. A number of minor and principal component learning rules for symmetric matrices and matrix pencils, many of which are new, are obtained by exploiting the symmetry of the constrained criteria. These algorithms may be seen as counterparts or generalizations of Oja's and Xu's systems for computing multiple principal component analyzers. Procedures for converting minor component flows into principal component flows are also discussed.
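  • Illustration
    The abstract names Oja's system as the prototype these rules generalize. A minimal sketch of an Oja-style principal component flow on a fixed symmetric matrix (this is an illustrative textbook rule, not the paper's weighted/shifted algorithms):

    ```python
    import numpy as np

    # Build a symmetric positive semi-definite matrix C, standing in for a
    # data covariance. (Illustrative setup; not from the paper.)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    C = A @ A.T

    # Oja's single-unit rule: w += eta * (C w - (w^T C w) w).
    # The fixed points are the unit eigenvectors of C; the dominant one is stable.
    w = rng.standard_normal(5)
    w /= np.linalg.norm(w)
    eta = 0.005
    for _ in range(20000):
        w += eta * (C @ w - (w @ C @ w) * w)

    # Compare against the dominant eigenvector from a direct eigendecomposition.
    vals, vecs = np.linalg.eigh(C)
    v_top = vecs[:, -1]
    alignment = abs(w @ v_top) / np.linalg.norm(w)
    print(alignment)  # close to 1 when the flow has converged
    ```

    Flipping the sign of the update (gradient descent on the Rayleigh quotient) yields the analogous minor component flow, which connects to the paper's discussion of converting minor component flows into principal component flows.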
  • Keywords
    eigenvalues and eigenfunctions; learning (artificial intelligence); matrix algebra; principal component analysis; diagonal matrix; diagonally weighted criteria; matrix pencils; multi-dimensional eigenspace; multiple eigenvectors; principal component extraction; principal component learning rules; shifted criteria; symmetric matrices; Algorithm design and analysis; Analysis of variance; Computer displays; Constraint optimization; Cost function; Data mining; Lagrangian functions; Matrix converters; Principal component analysis; Symmetric matrices;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
  • Print_ISBN
    0-7803-9048-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.2005.1556033
  • Filename
    1556033