• DocumentCode
    614608
  • Title
    Joint multitask feature learning and classifier design
  • Author
    Gutta, Sandeep; Cheng, Qi
  • Author_Institution
    Sch. of Electr. & Comput. Eng., Oklahoma State Univ., Stillwater, OK, USA
  • fYear
    2013
  • fDate
    20-22 March 2013
  • Firstpage
    1
  • Lastpage
    5
  • Abstract
    The problem of classification arises in many real-world applications. Classification with more than two classes is often broken down into a group of binary classification problems using the one-versus-rest or pairwise approach. For each binary classification problem, feature selection and classifier design are usually conducted separately. In this paper, we propose a new multitask learning approach in which feature selection and classifier design for all the binary classification tasks are carried out simultaneously. We consider probabilistic nonlinear kernel classifiers for binary classification. For each binary classifier, we assign weights to the features within the kernels. We assume that the matrix consisting of all the feature weights for all the tasks has a sparse component and a low-rank component. The sparse component determines the features that are relevant to each individual classifier, and the low-rank component determines the common feature subspace that is relevant to all the classifiers. Experimental results on synthetic data demonstrate that the proposed approach achieves higher classification accuracy than conventional classifiers, and that it accurately identifies the features important to each binary classifier.
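    The sparse-plus-low-rank assumption on the feature-weight matrix can be illustrated with a minimal sketch. This is not the authors' algorithm (the paper's optimization details are not given in this record); it is a generic alternating proximal decomposition, where soft-thresholding promotes the sparse (task-specific) component and singular-value thresholding promotes the low-rank (shared-subspace) component. All function names and parameter values here are illustrative assumptions.

    ```python
    import numpy as np

    def soft_threshold(X, tau):
        """Entrywise soft-thresholding: prox of tau*||X||_1; drives small entries to exact zero."""
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    def svd_threshold(X, tau):
        """Singular-value thresholding: prox of tau*||X||_*; shrinks/zeroes singular values."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def sparse_plus_low_rank(W, tau_s=0.5, tau_l=0.1, n_iter=200):
        """Alternately fit W ~ S + L, with S sparse (per-task features)
        and L low rank (feature subspace shared across tasks)."""
        S = np.zeros_like(W)
        L = np.zeros_like(W)
        for _ in range(n_iter):
            S = soft_threshold(W - L, tau_s)  # sparse, task-specific part
            L = svd_threshold(W - S, tau_l)   # low-rank, shared part
        return S, L

    # Toy feature-weight matrix: a rank-2 shared structure plus one sparse spike.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 8))
    W[0, 0] += 5.0
    S, L = sparse_plus_low_rank(W)
    ```

    In this toy setup, rows of `W` would correspond to binary tasks and columns to feature weights; the spike at `W[0, 0]` is the kind of task-specific relevance the sparse component is meant to absorb.
    
    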
  • Keywords
    learning (artificial intelligence); pattern classification; sparse matrices; binary classification problems; classification accuracy; classifier design; common feature subspace; feature selection; feature weights; joint multitask feature learning; low rank component; matrix; one-versus-rest approach; pairwise approach; probabilistic nonlinear kernel classifiers; sparse component; synthetic data; Joints; Kernel; Matrix decomposition; Optimization; Sparse matrices; Support vector machines; Vectors; Classification; feature selection; multitask learning; sparsity;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2013 47th Annual Conference on Information Sciences and Systems (CISS)
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    978-1-4673-5237-6
  • Electronic_ISBN
    978-1-4673-5238-3
  • Type
    conf
  • DOI
    10.1109/CISS.2013.6552296
  • Filename
    6552296