A framework for efficiently parallelizing a nonlinear noise reduction algorithm
2010
In hyperspectral imagery, noise reduction is a vital and common pre-processing step that must be executed both accurately and efficiently. Until recently, hyperspectral data was modeled using linear stochastic processes, and the noise was assumed to be confined to a narrow spatial frequency band. The signal and noise were thus considered independent, and most proposed noise reduction algorithms transform the hyperspectral data linearly from one space to another to separate noise from signal. Hyperspectral data, however, exhibits nonlinear characteristics that make the noise frequency- and signal-dependent [1, 2]. Therefore, to accurately reduce the noise in hyperspectral data, a nonlinear noise reduction algorithm, such as the one we propose in this paper, must be considered. Such an algorithm, however, is computationally expensive and requires parallelization. To this end, we offer a framework, which we have implemented and evaluated.
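The abstract does not detail the proposed algorithm or framework. As a rough illustration of the kind of data-parallel structure such a framework could exploit, the sketch below splits a hyperspectral cube into spatial tiles and denoises them concurrently; all names, the tiling scheme, and the stand-in per-tile spectral filter are assumptions for illustration, not the paper's method.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def denoise_tile(tile):
    """Placeholder per-tile noise reduction step.

    A simple 3-tap moving average along the spectral axis stands in for
    the (nonlinear, expensive) denoiser; the real algorithm is not
    specified in the abstract.
    """
    kernel = np.ones(3) / 3.0
    return np.apply_along_axis(
        lambda s: np.convolve(s, kernel, mode="same"), 2, tile
    )

def denoise_parallel(cube, n_tiles=4, workers=2):
    """Apply the per-tile denoiser to row-wise spatial tiles in parallel.

    Assumes the denoiser operates independently on each spatial tile of a
    (rows x cols x bands) cube, so tiles can be processed concurrently
    and concatenated back together.
    """
    tiles = np.array_split(cube, n_tiles, axis=0)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        out = list(ex.map(denoise_tile, tiles))
    return np.concatenate(out, axis=0)
```

Because each tile is independent, the result is identical to processing the tiles sequentially; a real framework would additionally have to handle tile borders if the denoiser uses spatial neighborhoods.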