DocumentCode :
605635
Title :
Multi-modality depth map fusion using primal-dual optimization
Author :
Ferstl, D. ; Ranftl, R. ; Ruther, M. ; Bischof, H.
Author_Institution :
Inst. for Comput. Graphics & Vision, Graz Univ. of Technol., Graz, Austria
fYear :
2013
fDate :
19-21 April 2013
Firstpage :
1
Lastpage :
8
Abstract :
We present a novel fusion method that combines complementary 3D and 2D imaging techniques. Consider a Time-of-Flight sensor that acquires a dense depth map over a wide depth range but at comparatively low resolution. Complementarily, a stereo sensor generates a high-resolution disparity map, but one that suffers from occlusions and outliers. Our method fuses the depth data, and optionally also intensity data, using a primal-dual optimization with an energy functional designed to compensate for missing parts, filter strong outliers, and reduce acquisition noise. The numerical algorithm is efficiently implemented on a GPU, achieving a processing speed of 10 to 15 frames per second. Experiments on synthetic, real, and benchmark datasets show that the results are superior to those of each sensor alone and to competing optimization techniques. In a practical example, we fuse a Kinect triangulation sensor and a small-size Time-of-Flight camera to create a gaming sensor with superior resolution, acquisition range, and accuracy.
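The abstract does not spell out the energy functional, so the following is only a minimal sketch of the general idea: fusing several depth maps with per-pixel confidence weights (zero where a sensor has occlusions or missing data) under a total-variation regularizer, minimized with the Chambolle-Pock primal-dual scheme. The functional here, min_u ||∇u||_1 + (λ/2) Σ_i w_i (u − d_i)², and all function names are assumptions for illustration, not the paper's actual model.

```python
import numpy as np

def grad(u):
    # Forward differences with Neumann (replicate) boundary.
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    # Divergence: negative adjoint of the forward-difference gradient.
    dx = np.zeros_like(px)
    dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]
    dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]
    dx[:, -1] = -px[:, -2]
    dy[0, :] = py[0, :]
    dy[1:-1, :] = py[1:-1, :] - py[:-2, :]
    dy[-1, :] = -py[-2, :]
    return dx + dy

def fuse_depth(depths, weights, lam=10.0, n_iter=300):
    """TV-regularized fusion of depth maps (hypothetical sketch).

    depths  : list of 2D arrays (e.g. upsampled ToF depth, stereo disparity)
    weights : per-pixel confidences, zero marks occlusions/missing data
    """
    # The weighted quadratic data terms collapse into one term around f.
    wsum = sum(weights)
    f = sum(w * d for w, d in zip(weights, depths)) / np.maximum(wsum, 1e-8)
    u = f.copy()
    u_bar = u.copy()
    px = np.zeros_like(u)
    py = np.zeros_like(u)
    tau = sigma = 1.0 / np.sqrt(8.0)  # tau*sigma*||grad||^2 <= 1
    for _ in range(n_iter):
        # Dual ascent on p, then projection onto the unit ball |p| <= 1.
        gx, gy = grad(u_bar)
        px += sigma * gx
        py += sigma * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2))
        px /= norm
        py /= norm
        # Primal descent on u: closed-form proximal step of the data term.
        u_old = u
        u = (u + tau * div(px, py) + tau * lam * wsum * f) / (1.0 + tau * lam * wsum)
        # Over-relaxation step of Chambolle-Pock.
        u_bar = 2.0 * u - u_old
    return u
```

The TV term lets the optimization inpaint regions where one sensor is occluded (confidence zero) from the other sensor and from neighboring pixels, while the quadratic data term averages down acquisition noise; the paper's GPU implementation exploits the fact that every step above is pixel-wise parallel.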
Keywords :
image sensors; sensor fusion; stereo image processing; complementary 2D imaging techniques; complementary 3D imaging techniques; disparity map; multimodality depth map fusion; primal-dual optimization; stereo sensor; time-of-flight sensor; Cameras; Data models; Fuses; Image resolution; Noise; Optimization; Robot sensing systems;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2013 IEEE International Conference on Computational Photography (ICCP)
Conference_Location :
Cambridge, MA
Print_ISBN :
978-1-4673-6463-8
Type :
conf
DOI :
10.1109/ICCPhot.2013.6528305
Filename :
6528305