Decoupled Certainty-Driven Consistency Loss for Semi-supervised Learning

2019 
One of the successful approaches in semi-supervised learning is based on a consistency loss between different predictions under random perturbations. Typically, a student model is trained to be consistent with the teacher's predictions for inputs under different perturbations. To succeed, however, the teacher's pseudo-labels must be of good quality; otherwise the whole learning process will fail. Unfortunately, existing methods do not assess the quality of the teacher's pseudo-labels. In this paper, we propose a novel certainty-driven consistency loss (CCL) that exploits predictive uncertainty information in the consistency loss to let the student dynamically learn from reliable targets. Specifically, we propose two approaches, Filtering CCL and Temperature CCL, which respectively filter out uncertain predictions or pay less attention to the uncertain ones in the consistency regularization. We combine the two approaches, which we call FT-CCL, to further improve the consistency-learning framework. In our experiments, FT-CCL shows improvements on a general semi-supervised learning task and robustness to noisy labels. We further introduce a novel mutual-learning method in which each student is decoupled from its own teacher and instead learns from the other student's teacher, in order to acquire additional knowledge. Experimental results demonstrate the advantages of our method over state-of-the-art semi-supervised deep learning methods.
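The two variants described above can be illustrated with a minimal sketch. The paper's exact formulation is not given in this abstract, so the details below are assumptions: predictive entropy of the teacher's softmax output is used as the uncertainty measure, the consistency term is a mean squared error between student and teacher probabilities, the entropy threshold for Filtering CCL and the soft weighting `exp(-entropy / tau)` for Temperature CCL are illustrative choices, and all function names are hypothetical.

```python
import numpy as np

def entropy(p, eps=1e-12):
    # Predictive entropy of a softmax output: one uncertainty value per sample.
    return -np.sum(p * np.log(p + eps), axis=-1)

def filtering_ccl(student_p, teacher_p, threshold):
    # Filtering CCL (sketch): keep only consistency terms whose teacher
    # prediction is certain enough (entropy below `threshold`).
    u = entropy(teacher_p)
    mask = u < threshold
    if not mask.any():
        return 0.0  # every target filtered out: no consistency signal
    sq_diff = (student_p[mask] - teacher_p[mask]) ** 2
    return sq_diff.sum(axis=-1).mean()

def temperature_ccl(student_p, teacher_p, tau=1.0):
    # Temperature CCL (sketch): instead of a hard cut, down-weight
    # uncertain teacher targets with a soft weight exp(-entropy / tau).
    u = entropy(teacher_p)
    w = np.exp(-u / tau)
    sq_diff = ((student_p - teacher_p) ** 2).sum(axis=-1)
    return (w * sq_diff).sum() / w.sum()
```

A combined FT-CCL would apply the filter first and then the soft weighting to the surviving targets; in a training loop, either loss would be added to the supervised loss on the labeled subset.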