Title :
A Novel Image Fusion Algorithm based on PCNN and Contrast
Author :
Miao Qiguang; Wang Baoshu
Author_Institution :
Sch. of Comput. Sci., Xidian Univ., Xi'an
Abstract :
The proposed fusion algorithm is based on an improved pulse coupled neural network (PCNN) model and on image contrast. In traditional algorithms the linking strength of every neuron is identical and its value is chosen by experiment; here, the contrast of each pixel is used as that pixel's linking strength, so the linking strength is chosen adaptively. After PCNN processing with the adaptive linking strength, a new fire mapping image is obtained for each image taking part in the fusion. The clear objects of each original image are selected pixel by pixel by a compare-selection operator applied to the fire mapping images, and are then merged into a new, clear image. The compare-selection operator uses not only the fire time but also an improved consistency check over the neighborhood of each pixel. Furthermore, with this algorithm, other parameters, such as the threshold adjusting constant Delta, have only a slight effect on the fused image, which overcomes the difficulty of adjusting parameters in PCNN. Experiments show that the proposed algorithm preserves edge and texture information better than the wavelet transform method and the Laplacian pyramid method in image fusion.
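The abstract describes the pipeline but gives no equations, so the following is only a minimal sketch of the idea, assuming a standard discrete PCNN (feeding F, linking L, internal activity U, dynamic threshold E). The contrast formula, all parameter values, the function names (local_contrast, pcnn_fire_times, fuse), and the use of a 3x3 median-filter majority vote in place of the paper's improved consistency check are illustrative assumptions, not the authors' actual settings.

```python
import numpy as np
from scipy import ndimage

def local_contrast(img, size=3):
    # Per-pixel contrast |I - local mean| / (local mean + eps); used as the
    # adaptive linking strength beta. The exact contrast formula is an
    # illustrative assumption, not taken from the paper.
    mean = ndimage.uniform_filter(img, size=size)
    return np.abs(img - mean) / (mean + 1e-6)

def pcnn_fire_times(img, n_iter=50, alpha_e=0.2, v_e=20.0, v_l=1.0):
    # Simplified discrete PCNN; returns the iteration at which each neuron
    # first fires (earlier firing = more salient pixel). Parameter values
    # are illustrative, not the paper's settings. Expects img in [0, 1].
    img = img.astype(float)
    beta = local_contrast(img)                 # adaptive linking strength
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])
    F = img                                    # feeding input = stimulus
    Y = np.zeros_like(F)                       # pulse output
    E = np.full_like(F, v_e)                   # dynamic threshold
    fire_time = np.full(F.shape, n_iter, dtype=int)
    for t in range(n_iter):
        L = v_l * ndimage.convolve(Y, kernel, mode='nearest')  # linking input
        U = F * (1.0 + beta * L)               # internal activity
        Y = (U > E).astype(float)              # fire when activity exceeds threshold
        E = E * np.exp(-alpha_e) + v_e * Y     # threshold decays, recharges on firing
        fire_time = np.where((Y > 0) & (fire_time == n_iter), t, fire_time)
    return fire_time

def fuse(img_a, img_b):
    # Compare-selection: keep the pixel whose neuron fired earlier, then
    # smooth the binary decision map with a 3x3 majority vote as a stand-in
    # for the paper's improved neighborhood consistency check.
    ta, tb = pcnn_fire_times(img_a), pcnn_fire_times(img_b)
    choose_a = (ta <= tb).astype(float)
    choose_a = ndimage.median_filter(choose_a, size=3)
    return choose_a * img_a + (1.0 - choose_a) * img_b
```

In this sketch, fuse expects two co-registered source images of the same size, normalized to [0, 1]; selecting the earlier fire time follows the abstract's rule that neurons driven by stronger, higher-contrast stimuli cross the decaying threshold sooner.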
Keywords :
image fusion; image texture; neural nets; PCNN; compare-selection operator; edge preservation; fire mapping image; image contrast; image fusion algorithm; pulse coupled neural network; texture information; Artificial neural networks; Fires; Humans; Image fusion; Image processing; Image segmentation; Joining processes; Neurons; Pixel; Wavelet transforms;
Conference_Titel :
2006 International Conference on Communications, Circuits and Systems Proceedings
Conference_Location :
Guilin
Print_ISBN :
0-7803-9584-0
Electronic_ISBN :
0-7803-9585-9
DOI :
10.1109/ICCCAS.2006.284695