
GPU-based SAR Image Lee Filtering

2019 
For suppressing and removing speckle noise in SAR images, the Lee filter is, on balance, considered a relatively good algorithm, and it performs well in many scenes. Its main disadvantage is long computation time: for large-format, high-resolution SAR images its real-time performance is poor. GPU devices can therefore be used to accelerate the algorithm. This paper first implements the classic Lee filtering algorithm in a Python environment, then uses Numba to accelerate it on the CPU, then reimplements the classic algorithm in C/C++, and finally implements GPU acceleration with CUDA. Comparison of the results shows that GPU acceleration can increase the speed by nearly two orders of magnitude.
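
For reference, below is a minimal sketch of the kind of pipeline the abstract describes: a windowed Lee filter accelerated on the CPU with Numba and then ported to the GPU, one thread per output pixel. The paper's GPU version is written in C/C++ CUDA; to stay in a single language, this sketch uses Numba's CUDA support instead. The window half-size, the equivalent number of looks (enl), and the multiplicative-noise weighting w = 1 - Cu^2/Ci^2 are illustrative assumptions, not values taken from the paper.

    import math
    import numpy as np
    from numba import njit, prange, cuda

    @njit(parallel=True)
    def lee_filter_cpu(img, half=3, enl=4.0):
        """Windowed Lee filter on the CPU (one common multiplicative-noise form)."""
        rows, cols = img.shape
        out = np.empty_like(img)
        cu2 = 1.0 / enl  # squared noise coefficient of variation, Cu^2 = 1/ENL
        for i in prange(rows):
            for j in range(cols):
                s = 0.0
                s2 = 0.0
                n = 0
                for ii in range(max(0, i - half), min(rows, i + half + 1)):
                    for jj in range(max(0, j - half), min(cols, j + half + 1)):
                        v = img[ii, jj]
                        s += v
                        s2 += v * v
                        n += 1
                mean = s / n
                var = s2 / n - mean * mean
                ci2 = var / (mean * mean + 1e-12)  # squared local coeff. of variation
                w = max(0.0, 1.0 - cu2 / (ci2 + 1e-12))
                out[i, j] = mean + w * (img[i, j] - mean)
        return out

    @cuda.jit
    def lee_filter_kernel(img, out, half, cu2):
        """Same per-pixel computation; each CUDA thread produces one output pixel."""
        i, j = cuda.grid(2)
        rows, cols = img.shape
        if i >= rows or j >= cols:
            return
        s = 0.0
        s2 = 0.0
        n = 0
        for ii in range(max(0, i - half), min(rows, i + half + 1)):
            for jj in range(max(0, j - half), min(cols, j + half + 1)):
                v = img[ii, jj]
                s += v
                s2 += v * v
                n += 1
        mean = s / n
        var = s2 / n - mean * mean
        ci2 = var / (mean * mean + 1e-12)
        w = max(0.0, 1.0 - cu2 / (ci2 + 1e-12))
        out[i, j] = mean + w * (img[i, j] - mean)

    # Example launch on a hypothetical 2048x2048 image (not the paper's data):
    img = np.random.rand(2048, 2048).astype(np.float32)
    d_img = cuda.to_device(img)
    d_out = cuda.device_array_like(d_img)
    threads = (16, 16)
    blocks = (math.ceil(img.shape[0] / threads[0]),
              math.ceil(img.shape[1] / threads[1]))
    lee_filter_kernel[blocks, threads](d_img, d_out, 3, 1.0 / 4.0)
    result = d_out.copy_to_host()

Because every output pixel is independent, the GPU port maps naturally to one thread per pixel, which is why the speedup grows with image size; each thread redundantly recomputes its own window statistics rather than sharing them, trading extra memory traffic for simplicity.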