Title :
Spatio-temporal image-based texture atlases for dynamic 3-D models
Author :
Jankó, Zsolt ; Pons, Jean-Philippe
Author_Institution :
IMAGINE, Univ. Paris-Est, Paris, France
fDate :
Sept. 27, 2009 - Oct. 4, 2009
Abstract :
In this paper, we propose a method for creating a high-quality spatio-temporal texture atlas from a dynamic 3-D model and a set of calibrated video sequences. By adopting a genuinely spatio-temporal perspective, rather than independent frame-by-frame computations, we fully exploit the very high redundancy in the input video sequences. First, we drastically reduce the amount of texture data, thereby greatly improving the portability and rendering efficiency of the model. Second, we gather the numerous viewpoint/time appearances of the scene, so as to recover from the low resolution, grazing views, highlights, shadows and occlusions that affect some regions of the spatio-temporal model. Altogether, our method allows novel views to be synthesized from a small amount of texture data, with optimal visual quality throughout the sequence, minimally visible color discontinuities, and no flickering artifacts. These properties are demonstrated on real datasets.
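Illustration (not part of the original record): a minimal, hypothetical Python sketch of the underlying idea of gathering viewpoint/time appearances, in which each triangle of the dynamic mesh is scored in every calibrated camera (favoring close, fronto-parallel views over low-resolution or grazing ones) and the best-scoring view is retained for the atlas. All function and variable names here are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def triangle_score(verts_3d, cam_P, cam_center):
        """Score one triangle in one camera: projected 2-D area (favoring close,
        fronto-parallel views), or -inf if the triangle faces away from the camera."""
        # Outward normal of the triangle in world coordinates.
        n = np.cross(verts_3d[1] - verts_3d[0], verts_3d[2] - verts_3d[0])
        view_dir = cam_center - verts_3d.mean(axis=0)
        if np.dot(n, view_dir) <= 0:           # back-facing: crude occlusion/grazing proxy
            return -np.inf
        # Project vertices with the 3x4 camera matrix P and measure the 2-D area.
        h = np.hstack([verts_3d, np.ones((3, 1))]) @ cam_P.T
        p = h[:, :2] / h[:, 2:3]
        area2d = 0.5 * abs(np.cross(p[1] - p[0], p[2] - p[0]))
        return area2d                          # larger projected area ~ higher texture resolution

    def best_views(mesh_frames, cameras):
        """For every (frame, triangle), pick the index of the camera giving the best score.
        mesh_frames: list of (vertices, triangles) per time step;
        cameras: list of (P, center) pairs for each calibrated video stream."""
        choice = {}
        for t, (V, F) in enumerate(mesh_frames):
            for f_idx, tri in enumerate(F):
                scores = [triangle_score(V[tri], P, c) for (P, c) in cameras]
                choice[(t, f_idx)] = int(np.argmax(scores))
        return choice

A full spatio-temporal method would additionally aggregate these choices across frames to suppress flickering and blend texture patches to hide color seams; this sketch only shows the per-triangle view-selection step.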
Keywords :
image sequences; image texture; calibrated video sequences; dynamic 3-D models; grazing views; spatio-temporal image-based texture atlases; Animation; Cameras; Conferences; Layout; Mesh generation; Redundancy; Reflectivity; Spatiotemporal phenomena; Veins; Video sequences;
Conference_Titel :
2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops)
Conference_Location :
Kyoto
Print_ISBN :
978-1-4244-4442-7
Electronic_ISBN :
978-1-4244-4441-0
DOI :
10.1109/ICCVW.2009.5457481