Title :
Learning Illumination Models While Tracking
Author :
Xu, Yilei ; Roy-Chowdhury, Amit K.
Author_Institution :
Dept. of Electr. Eng., Univ. of California, Riverside, CA
Abstract :
In this paper we present a method for estimating the 3D motion of a rigid object from a video sequence while simultaneously learning the parameters of an illumination model that describes the lighting conditions under which the video was captured. This is achieved by alternately estimating motion and illumination parameters within a recently proposed mathematical framework that integrates the effects of motion, illumination and structure. The motion is represented in terms of translation and rotation of the object centroid, and the illumination is represented using a spherical harmonics linear basis. The method assumes no model for the variation of the illumination conditions: lighting can change slowly or drastically, locally or globally, and may consist of combinations of point and extended sources. For multiple cameras viewing an object, we derive a new photometric constraint that relates the illumination parameters in two or more independent video sequences. This constraint allows verification of the illumination parameters obtained from multiple views and synthesis of new views under the same lighting conditions. We demonstrate the effectiveness of our algorithm in tracking under severe changes in lighting conditions.
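To illustrate the illumination half of the alternating scheme described above, the following is a minimal, hypothetical Python sketch (not the authors' implementation): with the pose held fixed, the coefficients of a 9-dimensional spherical-harmonic lighting basis are recovered by linear least squares from the corresponding basis images. The basis images and the observed frame here are random stand-ins, and all names are illustrative assumptions.

```python
import numpy as np

def estimate_illumination(basis_images, frame):
    """Least-squares fit of spherical-harmonic lighting coefficients.

    basis_images : (n_pixels, 9) array; each column is the object rendered
                   under one spherical-harmonic basis lighting (pose fixed).
    frame        : (n_pixels,) observed image intensities for that pose.
    Returns the 9 coefficients minimizing ||basis_images @ l - frame||.
    """
    coeffs, *_ = np.linalg.lstsq(basis_images, frame, rcond=None)
    return coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_pixels = 10_000
    B = rng.standard_normal((n_pixels, 9))          # stand-in basis images
    true_l = rng.standard_normal(9)                 # stand-in ground-truth lighting
    frame = B @ true_l + 0.01 * rng.standard_normal(n_pixels)  # noisy observation
    est_l = estimate_illumination(B, frame)
    print(np.allclose(est_l, true_l, atol=0.01))    # lighting coefficients recovered
```

In the full alternating scheme, a step like this would be interleaved with a motion-update step that re-estimates the translation and rotation of the object centroid while the lighting coefficients are held fixed.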
Keywords :
image sequences; lighting; motion estimation; solid modelling; video signal processing; 3D motion estimation; illumination model; mathematical framework; photometric constraint; spherical harmonics linear basis; video sequence; data processing; data visualization
Conference_Title :
Third International Symposium on 3D Data Processing, Visualization, and Transmission (3DPVT 2006)
Conference_Location :
Chapel Hill, NC
Print_ISBN :
0-7695-2825-2
DOI :
10.1109/3DPVT.2006.88