Title :
Incorporating Motion Blur Compensation to Blind Super Resolution Restoration
Author :
Sakuragi, Ryoichi ; Hamada, Nozomu
Author_Institution :
Signal Process. Lab., Keio Univ., Kanagawa
Abstract :
In various image-processing fields, high-resolution images are in demand. In medical diagnosis, for example, observed images should have higher resolution than is currently available. Recent investigations have demonstrated novel super resolution methods [1], which generate a high-resolution image from multiple low-resolution images. This paper proposes incorporating motion blur compensation into the blind super resolution problem. Relative motion between the camera and the objects in the scene produces motion blur in the observed images. To our knowledge, this situation has not been treated in previous super resolution work, because conventional methods usually assume that the images contain no motion blur. In our study, we model motion blur with a linear kernel; that is, the motion of the objects and/or the camera itself is assumed to be linear during the exposure of a frame. We then extend the blur model to the convolution of the camera PSF with the out-of-focus blur and the motion blur. Additionally, we propose a blur kernel estimation method based on a MAP algorithm. Simulation and experimental results confirm the effectiveness of the proposed method.
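The blur model described in the abstract (a linear motion blur kernel convolved with an out-of-focus PSF) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the kernel parameterization (length, angle, Gaussian defocus) and all function names are assumptions, and the paper's MAP-based kernel estimation is not reproduced here.

```python
import numpy as np

def motion_blur_kernel(length, angle_deg, size=15):
    """Linear motion blur: a 1-pixel-wide line segment of the given
    length and orientation, normalized to sum to 1 (hypothetical
    parameterization; the paper does not give exact details)."""
    k = np.zeros((size, size))
    c = size // 2
    theta = np.deg2rad(angle_deg)
    # Rasterize the line segment by sampling it densely.
    for t in np.linspace(-length / 2.0, length / 2.0, num=8 * size):
        x = int(round(c + t * np.cos(theta)))
        y = int(round(c + t * np.sin(theta)))
        if 0 <= x < size and 0 <= y < size:
            k[y, x] = 1.0
    return k / k.sum()

def defocus_psf(sigma, size=15):
    """Out-of-focus blur approximated here by a Gaussian PSF
    (an assumption; a disk PSF is another common choice)."""
    c = size // 2
    yy, xx = np.mgrid[-c:c + 1, -c:c + 1]
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()

def convolve2d_full(a, b):
    """Full 2-D linear convolution via zero-padded FFT."""
    H = a.shape[0] + b.shape[0] - 1
    W = a.shape[1] + b.shape[1] - 1
    fa = np.fft.rfft2(a, (H, W))
    fb = np.fft.rfft2(b, (H, W))
    return np.fft.irfft2(fa * fb, (H, W))

# Combined blur: convolution of the motion kernel and the defocus PSF,
# mirroring the extended blur model described in the abstract.
motion = motion_blur_kernel(length=7, angle_deg=30)
defocus = defocus_psf(sigma=1.5)
combined = convolve2d_full(motion, defocus)
```

Because both component kernels are normalized, the combined kernel also sums to (approximately) 1, so the overall blur preserves average image brightness.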
Keywords :
conjugate gradient methods; image resolution; image restoration; motion compensation; MAP algorithm; blind super resolution restoration; blur kernel estimation method; linear kernel model; motion blur compensation; Cameras; Convolution; Focusing; Image generation; Image resolution; Image restoration; Interpolation; Kernel; Probability; Signal resolution; Blind Super Resolution; Conjugate Gradient Method; MAP; Motion Blur;
Conference_Titel :
SICE-ICASE, 2006. International Joint Conference
Conference_Location :
Busan
Print_ISBN :
89-950038-4-7
Electronic_ISBN :
89-950038-5-5
DOI :
10.1109/SICE.2006.315673