• DocumentCode
    3113223
  • Title
    Depth estimation using monocular cues from single image
  • Author
    Salih, Yasir ; Malik, Aamir S. ; May, Zazilah
  • Author_Institution
    Dept. of Electr. & Electron. Eng., Univ. Teknol. PETRONAS, Tronoh, Malaysia
  • fYear
    2011
  • fDate
    19-20 Sept. 2011
  • Firstpage
    1
  • Lastpage
    4
  • Abstract
    This paper investigates depth estimation using monocular cues. The human visual system uses monocular cues such as texture, focus and shading for depth perception. Our proposed algorithm segments the image into homogeneous segments (superpixels) and then extracts the ground segment and the sky segment from them. These two segments guide the depth estimation by providing the region of maximum depth (sky) and the region of minimum depth (ground). The rest of the segments take depth values between those of the sky and the ground. The algorithm addresses images that contain both sky and ground. The ground acts as a support for segments (e.g., trees, buildings) in the image, so a vertical image segment tends to have a depth similar to that of its ground support. On the other hand, some segments are not supported by the ground but are connected to it; these segments take depth values larger than those of their nearest ground pixels.
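    The depth-assignment scheme described above can be sketched as follows. This is an illustrative reconstruction, not the authors' exact method: the function name, the use of a segment's mean row as its vertical position, and the linear interpolation between the ground and sky depth bounds are all assumptions.

    ```python
    # Hypothetical sketch of sky/ground-guided depth assignment.
    # segments: dict mapping segment id -> mean image row (0 = top of image).
    # Sky segments get the maximum depth, ground segments the minimum depth;
    # the remaining segments are interpolated by vertical position (assumed rule).
    def assign_depths(segments, sky_ids, ground_ids, d_min=0.0, d_max=1.0):
        rows = segments.values()
        top, bottom = min(rows), max(rows)
        depths = {}
        for sid, row in segments.items():
            if sid in sky_ids:
                depths[sid] = d_max          # sky: farthest region
            elif sid in ground_ids:
                depths[sid] = d_min          # ground: nearest region
            else:
                # Higher in the image (smaller row) -> farther away.
                t = (bottom - row) / (bottom - top) if bottom > top else 0.0
                depths[sid] = d_min + t * (d_max - d_min)
        return depths

    # Three segments: sky near the top, ground at the bottom, one in between.
    depths = assign_depths({0: 10, 1: 90, 2: 50}, sky_ids={0}, ground_ids={1})
    ```

    A segment sitting halfway between the sky's and ground's mean rows would receive the midpoint depth under this assumed interpolation; the paper's actual rule additionally ties vertical segments to the depth of their ground support.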
  • Keywords
    feature extraction; image segmentation; image texture; visual perception; algorithm; buildings; depth estimation; depth perception; focus; ground pixels; ground segment; homogenous segments; human visual system; image segmentation; monocular cues; shading; single image; sky segment; superpixels; texture; trees; Estimation; Feature extraction; Image color analysis; Image segmentation; Shape; Stereo vision; Three dimensional displays; depth from focus; graph-based segmentation; monocular features; texture feature;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    National Postgraduate Conference (NPC), 2011
  • Conference_Location
    Kuala Lumpur
  • Print_ISBN
    978-1-4577-1882-3
  • Type
    conf
  • DOI
    10.1109/NatPC.2011.6136388
  • Filename
    6136388