Title :
Camera position and posture estimation from still image using feature landmark database
Author :
Sato, Tomokazu ; Nishiumi, Yoshiyuki ; Susuki, Mitsutaka ; Nakagawa, Tomoka ; Yokoya, Naokazu
Author_Institution :
Nara Inst. of Sci. & Technol., Ikoma
Abstract :
Several human navigation services that use an embedded GPS receiver and a 2-D map are currently available on cellular phones. However, 2-D map-based navigation is not always easy for users to understand because it is not intuitive. To realize more intuitive human navigation, AR (augmented reality) based navigation, in which guidance information is overlaid on the real image, is expected to be the next-generation navigation system. For AR navigation, the key problem is how to acquire the accurate position and posture of the camera embedded in the cellular phone. Many researchers have intensively tackled the camera parameter estimation problem for AR in recent years. However, most of these methods cannot be applied to current mobile devices because they are designed for video sequences, in which temporal information such as the camera parameters of the previous frame is available. In this research, we propose a novel method that estimates the camera parameters of a single input image using SIFT features and a voting scheme.
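The abstract does not detail the voting scheme, but the general idea of descriptor-vote aggregation against a landmark database can be illustrated with a minimal sketch. All names and data below are hypothetical, not the authors' implementation: each query feature that matches a database landmark casts a vote for that landmark's location, and the location with the most votes seeds the subsequent camera pose estimation.

```python
from collections import Counter

def vote_for_location(matches):
    """Aggregate feature-match votes over candidate landmark locations.

    matches: list of (query_feature_id, landmark_location_id) pairs,
    as might be produced by nearest-neighbour SIFT descriptor matching
    against a feature landmark database (illustrative only).
    Returns the location with the most votes and its vote count.
    """
    votes = Counter(location for _, location in matches)
    best_location, n_votes = votes.most_common(1)[0]
    return best_location, n_votes

# Toy example: five matched features, three voting for location "A3".
matches = [(0, "A3"), (1, "A3"), (2, "B1"), (3, "A3"), (4, "C2")]
best, n = vote_for_location(matches)  # -> ("A3", 3)
```

In a full pipeline, the winning location's 3-D landmark correspondences would then be passed to a perspective-n-point solver to recover camera position and posture; majority voting serves here only to reject outlier matches before that step.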
Keywords :
augmented reality; cameras; feature extraction; image sequences; pose estimation; radionavigation; video signal processing; visual databases; 2D map based human navigation; SIFT features; augmented reality; camera parameter estimation; camera parameter estimation problem; embedded GPS; feature landmark database; human navigation services; mobile devices; posture estimation; still image; video sequence; voting scheme; Augmented reality; Cameras; Cellular phones; Global Positioning System; Humans; Image databases; Navigation; Parameter estimation; Spatial databases; Video sequences; extrinsic camera parameter estimation; landmark database; user localization;
Conference_Title :
SICE Annual Conference, 2008
Conference_Location :
Tokyo
Print_ISBN :
978-4-907764-30-2
Electronic_ISBN :
978-4-907764-29-6
DOI :
10.1109/SICE.2008.4654900