Unified active and semi-supervised learning for hyperspectral image classification

2021 
Large-scale labeled data is crucial for training a classification model with strong generalization ability. However, collecting such data is expensive, especially in the remote sensing field, so the labeled data available for hyperspectral image classification is very limited. Active learning and semi-supervised learning are two popular techniques in the machine learning community for addressing this challenge. In this paper, we integrate active learning and semi-supervised learning into a single framework that improves the quality of pseudo-labels for hyperspectral remote sensing images. The proposed method exploits the collaboration of spatial and spectral features to strengthen the classifier. Specifically, we train two classifiers on the labeled data, one with spatial features and one with spectral features, and combine their prediction probabilities into a stronger joint prediction. Using active learning, we then select a batch of the most informative samples and obtain an enlarged labeled dataset, on which two new classifiers are trained and their predictions are again combined. To guarantee the quality of the pseudo-labels, only samples that receive the same predicted label before and after active learning are assigned pseudo-labels; samples that cannot be assigned a high-confidence pseudo-label form the candidate pool for the next round of active learning. The final predictions are produced by classification models trained on both the pseudo-labeled and the labeled samples, using both spatial and spectral features. Experiments on two popular hyperspectral images show that the proposed method outperforms state-of-the-art and baseline methods.
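The core steps of the pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the fusion rule (a simple average of the two classifiers' probabilities), the informativeness criterion (prediction entropy), and all function names are assumptions for the sake of the example.

```python
import numpy as np

def combine_predictions(p_spectral, p_spatial):
    """Fuse the spectral and spatial classifiers' class probabilities.

    A simple average is assumed here; the paper's exact fusion rule
    may differ.
    """
    return (p_spectral + p_spatial) / 2.0

def select_informative(p_combined, batch_size):
    """Active-learning step: return indices of the batch_size samples
    whose fused prediction entropy is highest (most informative)."""
    eps = 1e-12  # avoid log(0)
    entropy = -np.sum(p_combined * np.log(p_combined + eps), axis=1)
    return np.argsort(entropy)[::-1][:batch_size]

def assign_pseudo_labels(p_before, p_after):
    """Keep pseudo-labels only where the fused predictions agree
    before and after active learning; disagreeing samples go back
    to the candidate pool."""
    y_before = p_before.argmax(axis=1)
    y_after = p_after.argmax(axis=1)
    agree = y_before == y_after
    return y_after, agree
```

In this sketch, `p_spectral` and `p_spatial` are `(n_samples, n_classes)` probability matrices from the two classifiers; the boolean mask returned by `assign_pseudo_labels` separates confidently pseudo-labeled samples from the remaining candidate pool.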