Title :
Variational Depth from Defocus in real-time
Author :
Ben-Ari, Rami; Raveh, Gonen
Author_Institution :
Orbotech Ltd., Yavne, Israel
Abstract :
With the emergence of a new generation of digital cameras offering 3D reconstruction of the viewed scene, Depth from Defocus (DFD) presents an attractive option. In this approach, the depth profile of the scene is recovered from two views captured with different focus settings. DFD is well known as a computationally intensive method due to the shift-variant filtering involved in its estimation. In this paper we present a parallel GPGPU implementation of DFD based on the variational framework, enabling computation at up to 15 frames/sec on an SVGA sequence. This constitutes the first GPU implementation and the fastest implementation known for passive DFD. The speed-up is obtained by combining the novel Fast Explicit Diffusion approach with the fine-grain data parallelism of an explicit scheme. We evaluate our method on publicly available real data and compare its results to a recently published PDE-based method. The proposed method outperforms previous DFD techniques in terms of accuracy/runtime, suggesting DFD as an alternative for real-time 3D reconstruction.
Keywords :
cameras; filtering theory; graphics processing units; image reconstruction; image sequences; solid modelling; 3D reconstruction; DFD technique; GPGPU implementation; PDE based method; SVGA sequence; computationally-intensive method; digital camera; fast explicit diffusion approach; fine grain data parallelism; real-time defocussing; shift-variant filtering; variational depth; Convolution; Graphics processing unit; Kernel; Mathematical model; Minimization; Performance evaluation; Real time systems;
Conference_Title :
2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops)
Conference_Location :
Barcelona
Print_ISBN :
978-1-4673-0062-9
DOI :
10.1109/ICCVW.2011.6130287
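Illustrative_Code_Sketch :
The abstract names two ingredients of the reported speed-up: the Fast Explicit Diffusion (FED) scheme and fine-grain data parallelism in an explicit update. The sketch below is not the authors' code; it only illustrates, under assumptions, how these pieces typically combine. The FED cycle step sizes follow the published FED formula tau_i = tau_max / (2 cos^2(pi (2i + 1) / (4n + 2))), computed on the host, and a CUDA kernel performs one explicit per-pixel update of the depth map with a homogeneous Laplacian smoothness term. The data-term residual, the parameter values (lambda, tau_max, the cycle length of 10) and the buffer handling are hypothetical placeholders, not details taken from the paper.

    #include <cuda_runtime.h>
    #include <cmath>
    #include <utility>
    #include <vector>

    // One explicit update step, one thread per pixel (fine-grain data parallelism).
    // dIn/dOut are ping-pong buffers holding the depth map; res is a placeholder
    // for the derivative of the defocus data term at each pixel.
    __global__ void explicitStep(const float* dIn, float* dOut, const float* res,
                                 float tau, float lambda, int w, int h)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= w || y >= h) return;

        // Reflecting (Neumann) boundaries via clamped neighbour indices.
        int xl = max(x - 1, 0), xr = min(x + 1, w - 1);
        int yu = max(y - 1, 0), yd = min(y + 1, h - 1);
        int i  = y * w + x;

        float lap = dIn[y * w + xl] + dIn[y * w + xr]
                  + dIn[yu * w + x] + dIn[yd * w + x] - 4.0f * dIn[i];

        // Gradient-descent style update: smoothness term minus weighted data term.
        dOut[i] = dIn[i] + tau * (lap - lambda * res[i]);
    }

    // FED step sizes for one cycle of n inner steps:
    //   tau_i = tau_max / (2 * cos^2( pi * (2i + 1) / (4n + 2) )),  i = 0..n-1.
    // Single steps may exceed the stability limit tau_max; the full cycle is stable.
    std::vector<float> fedStepSizes(int n, float tauMax)
    {
        const float kPi = 3.14159265358979f;
        std::vector<float> tau(n);
        for (int i = 0; i < n; ++i) {
            float c = std::cos(kPi * (2.0f * i + 1.0f) / (4.0f * n + 2.0f));
            tau[i] = tauMax / (2.0f * c * c);
        }
        return tau;
    }

    int main()
    {
        const int w = 800, h = 600;                         // SVGA frame size
        const size_t bytes = size_t(w) * h * sizeof(float);

        float *dA, *dB, *dRes;
        cudaMalloc((void**)&dA, bytes);
        cudaMalloc((void**)&dB, bytes);
        cudaMalloc((void**)&dRes, bytes);
        cudaMemset(dA, 0, bytes);                           // depth map initialisation (placeholder)
        cudaMemset(dRes, 0, bytes);                         // data-term residual (placeholder)

        dim3 block(16, 16);
        dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);

        // One FED cycle of 10 inner steps; tau_max = 0.25 is the usual explicit
        // stability limit for the 2D 5-point Laplacian on a unit grid.
        std::vector<float> tau = fedStepSizes(10, 0.25f);
        for (float t : tau) {
            explicitStep<<<grid, block>>>(dA, dB, dRes, t, 0.1f, w, h);
            std::swap(dA, dB);                              // ping-pong the buffers
        }
        cudaDeviceSynchronize();

        cudaFree(dA);
        cudaFree(dB);
        cudaFree(dRes);
        return 0;
    }

In a full DFD pipeline the residual buffer would be recomputed from the two differently focused input frames before each cycle; here it is left zero-initialised purely to keep the sketch self-contained.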