  • DocumentCode
    595234
  • Title
    Dense reconstruction by stereo-motion under perspective camera model
  • Author
    Mu Fang; Chung, Ronald
  • Author_Institution
    Dept. of Mech. & Autom. Eng., Chinese Univ. of Hong Kong, Hong Kong, China
  • fYear
    2012
  • fDate
    11-15 Nov. 2012
  • Firstpage
    2480
  • Lastpage
    2483
  • Abstract
    This paper presents a new stereo-motion approach for dense and accurate 3D scene reconstruction that allows the cameras to be described by the full perspective model. Given a short and arbitrary motion of a stereo camera rig, the projective depth of every image point can be recovered from the rank-four property of a matrix composed of the image positions of the scene points, and the associated 3D position can thereby be determined accurately. Compared with earlier methods, the approach allows the full perspective model to describe the cameras and thus attains higher accuracy. In addition, the projective depths are recovered without the need for an initial guess of the depth map, or for iterations or approximation. The recovery process requires only a few stereo correspondences over the entire scene to start with, and points that are occluded in some of the views can also be reconstructed. Experiments on real image sequences illustrate the effectiveness of the approach.
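    A minimal illustrative sketch of the rank-four property referred to in the abstract, written in Python with NumPy: it only shows that once every image point is scaled by its correct projective depth, the stacked measurement matrix has rank four, and that a rank-four factorization of it yields a projective reconstruction. The synthetic scene, variable names, and use of SVD below are assumptions made for the illustration, not the authors' algorithm, which recovers the depths directly from a few stereo correspondences of a stereo-motion setup without iteration or an initial depth map.

        import numpy as np

        # Illustrative sketch (not the paper's algorithm): with the correct
        # projective depths lambda_ij, the scaled measurement matrix
        #   W = [ lambda_ij * x_ij ]   (3m x n, for m views and n points)
        # has rank 4 under the full perspective model, and a rank-4
        # factorization W ~ P_hat X_hat gives cameras and 3D points up to
        # a projective transformation.
        rng = np.random.default_rng(0)

        # Synthetic scene: n homogeneous 3D points and m projective cameras.
        n_points, n_views = 20, 4
        X = np.vstack([rng.normal(size=(3, n_points)), np.ones((1, n_points))])
        P = [rng.normal(size=(3, 4)) for _ in range(n_views)]

        # Stack the correctly scaled projections lambda_ij * x_ij into W.
        W = np.vstack([Pi @ X for Pi in P])            # shape (3*m, n)

        # Rank-four property: singular values beyond the fourth vanish.
        s = np.linalg.svd(W, compute_uv=False)
        print("singular values:", np.round(s[:6], 3))  # only four are non-zero

        # A rank-4 factorization recovers a projective reconstruction.
        U, S, Vt = np.linalg.svd(W)
        P_hat = U[:, :4] * np.sqrt(S[:4])              # stacked camera estimates
        X_hat = np.sqrt(S[:4])[:, None] * Vt[:4, :]    # homogeneous 3D points
        print("factorization residual:", np.linalg.norm(W - P_hat @ X_hat))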
  • Keywords
    cameras; image motion analysis; image reconstruction; image sequences; matrix algebra; stereo image processing; 3D position; 3D scene reconstruction; camera stereo rig; dense reconstruction; depth map; full perspective model; image point projective depth; image sequences; matrix rank-four property; perspective camera model; recovery process; stereo correspondence; stereo-motion approach; Cameras; Computer vision; Image reconstruction; Image sequences; Pattern recognition; Stereo vision; Visualization
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2012 21st International Conference on Pattern Recognition (ICPR)
  • Conference_Location
    Tsukuba, Japan
  • ISSN
    1051-4651
  • Print_ISBN
    978-1-4673-2216-4
  • Type
    conf
  • Filename
    6460670