  • DocumentCode
    3136023
  • Title
    Discriminant analysis for perceptionally comparable classes
  • Author
    Ma, Bingpeng; Shan, Shiguang; Chen, Xilin; Gao, Wen
  • Author_Institution
    Inst. of Comput. Technol., Chinese Acad. of Sci., Beijing
  • fYear
    2008
  • fDate
    17-19 Sept. 2008
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    Traditional discriminant analysis treats all the involved classes equally when computing the between-class scatter matrix. However, for many vision tasks the classes to be processed are not perceptually equal, i.e., a distance metric can be defined between the classes; typical examples include head pose classification and age estimation. Aiming at this category of classification problems, this paper proposes a novel discriminant analysis method, called Class Distance based Discriminant Analysis (CDDA). In CDDA, the perceptional distance between two classes is used to weight the corresponding outer product in the between-class scatter computation, so that the method concentrates more on the classes that are difficult to separate. Another novelty of CDDA is that, to preserve the within-class local structure of multimodal labeled data, the within-class scatter is redefined by supplementing it with the similarity of sample pairs from nearby classes. The method is applied to head pose classification and age estimation, and experimental results demonstrate the effectiveness of CDDA.
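  • Illustrative_Sketch
    A minimal sketch, for illustration only, of the class-distance-weighted between-class scatter described in the abstract; it is not the authors' implementation. The pairwise formulation, the prior weighting, and the weighting function w are assumptions, and the class_distance callable stands in for the perceptional metric (for head pose classification it could plausibly be the absolute angular difference between pose classes).

        import numpy as np

        def weighted_between_class_scatter(X, y, class_distance, w=None):
            """Between-class scatter whose pairwise outer products are weighted
            by a perceptional class distance (illustrative sketch).

            X : (n_samples, n_features) data matrix
            y : (n_samples,) integer class labels
            class_distance : callable (ci, cj) -> perceptional distance
            w : callable mapping a distance to a weight; by assumption, classes
                that are close in perception (hard to separate) get larger weights
            """
            if w is None:
                w = lambda d: 1.0 / (1.0 + d)  # assumed weighting, not from the paper

            classes = np.unique(y)
            means = {c: X[y == c].mean(axis=0) for c in classes}
            priors = {c: np.mean(y == c) for c in classes}

            n_features = X.shape[1]
            Sb = np.zeros((n_features, n_features))
            # accumulate distance-weighted outer products over all class pairs
            for i, ci in enumerate(classes):
                for cj in classes[i + 1:]:
                    diff = (means[ci] - means[cj]).reshape(-1, 1)
                    Sb += priors[ci] * priors[cj] * w(class_distance(ci, cj)) * (diff @ diff.T)
            return Sb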
  • Keywords
    image classification; matrix algebra; pose estimation; age estimation; between-class scatter matrix; dimensionality reduction; discriminant analysis; head pose classification; perceptionally comparable classes; vision tasks; Computer science; Content addressable storage; Data visualization; Equations; Information analysis; Information processing; Scattering;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Automatic Face & Gesture Recognition, 2008. FG '08. 8th IEEE International Conference on
  • Conference_Location
    Amsterdam
  • Print_ISBN
    978-1-4244-2153-4
  • Electronic_ISBN
    978-1-4244-2154-1
  • Type
    conf
  • DOI
    10.1109/AFGR.2008.4813422
  • Filename
    4813422