Fusion of infrared and visible images with propagation filtering

2018 
Abstract Edge-preserving filters have been widely applied to achieve high-quality fusion of infrared and visible images. However, traditional edge-preserving filtering decomposition relies on explicit spatial kernel functions, and the difficulty of determining predefined pixel neighborhood regions degrades fusion performance. To remedy this deficiency, a novel method based on propagation filtering is proposed for the fusion of infrared and visible images, which consists of three steps: (1) the source images are decomposed with Gaussian and propagation filters; (2) combination rules are designed for the base layers and the detail layers; (3) reconstruction is performed to obtain the fused image. Compared with traditional edge-preserving multi-scale decomposition, the proposed decomposition involves no spatial kernels, which removes the difficulty of determining predefined pixel neighborhood regions. Moreover, the base-layer combination achieves saliency growth and artifact suppression. In addition, through weighted least squares optimization, more visual details and less irrelevant infrared information are added to the combined detail layers to better match human perception. Experimental results demonstrate that the proposed method achieves superior fusion performance compared with other commonly used image fusion algorithms.
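
The abstract describes a three-step pipeline: two-scale decomposition with Gaussian and propagation filters, separate combination rules for base and detail layers, and reconstruction by summing the combined layers. The sketch below illustrates that structure in Python with NumPy/SciPy under several simplifying assumptions: the helper names (propagation_filter_1d, fuse) are hypothetical, the propagation filter is reduced to a 1-D approximation with circular boundary handling, and the saliency-weighted base rule and max-absolute detail rule are crude stand-ins for the paper's saliency-growth, artifact-suppression and weighted-least-squares rules, which are not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def propagation_filter_1d(img, radius=5, sigma=0.05):
    """Simplified 1-D propagation filter along the last axis (an assumption;
    the paper uses a full 2-D propagation filter).

    Each pixel on the path contributes with a weight that accumulates two
    photometric penalties: the difference to the previous pixel on the path
    and the difference to the centre pixel. No spatial kernel is involved.
    Expects float images, e.g. in [0, 1]. Uses circular boundaries for brevity.
    """
    out = img.astype(float).copy()          # centre pixel, weight 1
    norm = np.ones_like(out)
    for direction in (-1, 1):                # propagate left and right
        w = np.ones_like(out)                # accumulated path weight
        prev = out
        for k in range(1, radius + 1):
            cur = np.roll(img.astype(float), -direction * k, axis=-1)
            w = w * np.exp(-(cur - prev) ** 2 / (2 * sigma ** 2)) \
                  * np.exp(-(cur - img) ** 2 / (2 * sigma ** 2))
            out = out + w * cur
            norm = norm + w
            prev = cur
    return out / norm


def fuse(ir, vis, base_sigma=2.0):
    """Minimal sketch of the three-step fusion pipeline (hypothetical rules)."""
    # Step 1: two-scale decomposition. Gaussian smoothing gives the base
    # layers; the (simplified) propagation filter refines the detail layers.
    base_ir = gaussian_filter(ir, base_sigma)
    base_vis = gaussian_filter(vis, base_sigma)
    detail_ir = propagation_filter_1d(ir - base_ir)
    detail_vis = propagation_filter_1d(vis - base_vis)

    # Step 2: combination rules. Smoothed absolute high-pass response acts as
    # a crude saliency proxy for weighting the base layers; detail layers are
    # combined by max-absolute selection. Both are placeholders for the
    # saliency-growth and WLS-optimised rules described in the abstract.
    sal_ir = gaussian_filter(np.abs(ir - gaussian_filter(ir, 5)), 5)
    sal_vis = gaussian_filter(np.abs(vis - gaussian_filter(vis, 5)), 5)
    w = sal_ir / (sal_ir + sal_vis + 1e-12)
    base = w * base_ir + (1 - w) * base_vis
    detail = np.where(np.abs(detail_ir) > np.abs(detail_vis),
                      detail_ir, detail_vis)

    # Step 3: reconstruction by summing the combined base and detail layers.
    return np.clip(base + detail, 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ir = rng.random((64, 64))    # stand-in for a registered infrared image
    vis = rng.random((64, 64))   # stand-in for the visible image
    fused = fuse(ir, vis)
    print(fused.shape, float(fused.min()), float(fused.max()))
```

Note that the propagation weights depend only on photometric differences accumulated along the path, not on pixel coordinates, which is why no spatial kernel or predefined neighborhood shape is needed; this is the property the abstract contrasts with traditional edge-preserving decompositions.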