• DocumentCode
    1760868
  • Title
    Personal Clothing Retrieval on Photo Collections by Color and Attributes
  • Author
    Xianwang Wang ; Tong Zhang ; Daniel R. Tretter ; Qian Lin
  • Author_Institution
    Hewlett-Packard Labs., Hewlett-Packard Co., Palo Alto, CA, USA
  • Volume
    15
  • Issue
    8
  • fYear
    2013
  • fDate
    Dec. 2013
  • Firstpage
    2035
  • Lastpage
    2045
  • Abstract
    Automatic personal clothing retrieval on photo collections, i.e., searching for the same clothes worn by the same person, is not a trivial problem, as photos are usually taken under completely uncontrolled, realistic imaging conditions. Typically, the captured clothing images exhibit large variations due to geometric deformation, occlusion, cluttered backgrounds, and photometric variability from illumination and viewpoint, which pose significant challenges to text-based or reranking-based visual search methods. In this paper, a novel framework is presented to tackle these issues by leveraging both low-level features (e.g., color) and high-level features (attributes) of clothing. First, a content-based image retrieval (CBIR) approach based on the bag-of-visual-words (BOW) model is developed as our baseline system, in which a codebook is constructed from extracted dominant color patches. A reranking approach is then proposed to improve search quality by exploiting clothing attributes, including the type of clothing, sleeves, patterns, etc. Compared to low-level features, the attributes are more robust to clothing variations and carry semantic meaning as high-level image representations. Different visual attribute detectors are learned from large amounts of training data to extract the corresponding attributes. The construction of the codebook and the training of the attribute classifiers are conducted offline, which leads to fast online search performance. Extensive experiments on photo collections show that the reranking algorithm based on attribute learning significantly improves retrieval performance in combination with the proposed baseline. Even our color-based baseline alone outperforms previous CBIR-based search approaches. The experiments also demonstrate that our approach is robust to large variations in images taken in unconstrained environments.
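    For illustration only, the short Python sketch below loosely mirrors the two-stage pipeline the abstract describes: a color bag-of-visual-words baseline built from an offline codebook, followed by attribute-based reranking. All function names, the fusion weight alpha, and the toy data are assumptions made for this sketch, not details taken from the paper.

    # Hypothetical sketch of the two-stage retrieval pipeline summarized in the
    # abstract; names and parameters are illustrative, not from the paper.
    import numpy as np

    def bow_histogram(color_patches, codebook):
        """Quantize dominant-color patches against an offline-built codebook
        and return a normalized bag-of-visual-words histogram."""
        # Assign each patch to its nearest codeword (Euclidean distance in color space).
        dists = np.linalg.norm(color_patches[:, None, :] - codebook[None, :, :], axis=2)
        assignments = dists.argmin(axis=1)
        hist = np.bincount(assignments, minlength=len(codebook)).astype(float)
        return hist / (hist.sum() + 1e-8)

    def baseline_color_scores(query_hist, gallery_hists):
        """Rank gallery clothing images by histogram-intersection similarity."""
        return np.array([np.minimum(query_hist, h).sum() for h in gallery_hists])

    def attribute_rerank(color_scores, query_attrs, gallery_attrs, alpha=0.5):
        """Fuse baseline color scores with attribute agreement (e.g., clothing
        type, sleeves, pattern) from pre-trained attribute detectors;
        alpha is an assumed fusion weight, not a value reported in the paper."""
        attr_scores = np.array([(query_attrs == a).mean() for a in gallery_attrs])
        return alpha * color_scores + (1.0 - alpha) * attr_scores

    # Toy usage: 3 gallery items, a 4-word color codebook, 3 binary attributes.
    codebook = np.random.rand(4, 3)                      # offline color codebook
    query_hist = bow_histogram(np.random.rand(20, 3), codebook)
    gallery_hists = [bow_histogram(np.random.rand(20, 3), codebook) for _ in range(3)]
    color_scores = baseline_color_scores(query_hist, gallery_hists)
    query_attrs = np.array([1, 0, 1])                    # detector outputs for the query
    gallery_attrs = [np.array([1, 0, 1]), np.array([0, 1, 0]), np.array([1, 1, 1])]
    final_scores = attribute_rerank(color_scores, query_attrs, gallery_attrs)
    ranking = np.argsort(-final_scores)                  # best-matching gallery indices first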
  • Keywords
    clothing; content-based retrieval; image colour analysis; image representation; image retrieval; BOW model; CBIR; automatic personal clothing retrieval; bag-of-visual-words; cluttered background; geometric deformation; image representations; online search performance; photo collections; photometric variability; realistic imaging; unconstrained environment; visual search methods; Clothing; Image color analysis; Image retrieval; Lighting; Robustness; Semantics; Visualization; Attribute learning; clothing retrieval; color matching; reranking;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Multimedia
  • Publisher
    IEEE
  • ISSN
    1520-9210
  • Type
    jour
  • DOI
    10.1109/TMM.2013.2279658
  • Filename
    6585791