GPU implementation of RX detection using spectral derivative features

2018 
Hyperspectral images (HSIs), which record abundance information for each pixel, have shown great potential in applications such as image classification and target and anomaly detection. Anomaly detection has attracted growing attention because it does not require a spectral library. A standard approach is the method developed by Reed and Yu, known as the Reed-Xiaoli (RX) algorithm. However, as imaging technology develops, data volumes keep growing; the resulting rapid increase in computational complexity makes the application time-consuming. In addition, HSIs contain noise caused by illumination and atmospheric effects. In this paper, we implement the RX algorithm on an NVIDIA GeForce 1060 GPU using spectral derivative features. On one hand, the parallel GPU implementation achieves real-time processing and alleviates the storage burden of on-board processing. On the other hand, derivative features better emphasize salient spectral features and suppress noise, which further improves the detection performance of the RX detector (RXD). In our experiments, three real HSI datasets were used to evaluate the GPU parallel implementation. The experimental results indicate that using derivative features improves detection performance, and compared with serial computation, the parallel implementation greatly reduces processing time.
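For reference, the classical global RX detector assigns each pixel spectrum the Mahalanobis distance from the background statistics; this is the standard textbook form rather than a formula quoted from the paper:

```latex
\[
  \mathrm{RX}(\mathbf{x}) =
    (\mathbf{x}-\boldsymbol{\mu})^{\mathsf T}\,
    \boldsymbol{\Sigma}^{-1}\,
    (\mathbf{x}-\boldsymbol{\mu}),
  \qquad
  \boldsymbol{\mu} = \frac{1}{N}\sum_{i=1}^{N}\mathbf{x}_i,
  \quad
  \boldsymbol{\Sigma} = \frac{1}{N}\sum_{i=1}^{N}
    (\mathbf{x}_i-\boldsymbol{\mu})(\mathbf{x}_i-\boldsymbol{\mu})^{\mathsf T},
\]
```

where \(\mathbf{x}\) is a pixel spectrum, \(\boldsymbol{\mu}\) and \(\boldsymbol{\Sigma}\) are the sample mean and covariance of the \(N\) background pixels, and pixels with large RX scores are flagged as anomalies.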
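The abstract does not include source code; the following CUDA sketch illustrates one plausible way to compute first-order spectral derivative features in parallel, with one thread per pixel. The band-sequential memory layout, the kernel name, and the launch configuration are illustrative assumptions, not the authors' implementation.

```cuda
#include <cuda_runtime.h>

// Sketch (assumed, not the paper's kernel): first-order spectral
// derivative features via forward differences along the band axis.
// Layout assumption: band-sequential (BSQ), cube[b * numPixels + p].
__global__ void spectralDerivativeKernel(const float* cube,
                                         float* deriv,
                                         int numPixels,
                                         int numBands)
{
    int p = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per pixel
    if (p >= numPixels) return;

    // Difference between adjacent bands yields numBands - 1 features.
    for (int b = 0; b < numBands - 1; ++b) {
        deriv[b * numPixels + p] =
            cube[(b + 1) * numPixels + p] - cube[b * numPixels + p];
    }
}

// Hypothetical host-side launch for an H x W x B cube already on the device.
void computeDerivativeFeatures(const float* d_cube, float* d_deriv,
                               int height, int width, int numBands)
{
    int numPixels = height * width;
    int threads = 256;
    int blocks = (numPixels + threads - 1) / threads;
    spectralDerivativeKernel<<<blocks, threads>>>(d_cube, d_deriv,
                                                  numPixels, numBands);
    cudaDeviceSynchronize();
}
```

A forward difference between adjacent bands is the simplest derivative estimator; smoother variants such as central differences follow the same per-pixel pattern, and the resulting feature cube can then be fed to the RX detector in place of the raw spectra.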