Adaptive Weighted Losses With Distribution Approximation for Efficient Consistency-Based Semi-Supervised Learning

2022 
Recent semi-supervised learning (SSL) algorithms such as FixMatch achieve state-of-the-art performance by exploiting consistency regularization and entropy minimization. However, many consistency-based SSL algorithms extract pseudo-labels from unlabeled data with a fixed confidence threshold and ignore the differing learning progress of each category. As a result, easy-to-learn categories contribute more examples to the loss, creating a class-imbalance problem and reducing training efficiency. To improve training reliability, we propose adaptive weighted losses (AWL). By evaluating class-wise learning progress, the loss contribution of each category's pseudo-labeled data is continuously and dynamically adjusted during training, steadily improving the model's pseudo-label discrimination ability. Moreover, to improve training efficiency, we propose a bidirectional distribution approximation (DA) method, which introduces the consistency information of predictions below the threshold into the loss calculation and significantly accelerates model convergence. By combining AWL and DA, our method surpasses other algorithms on multiple benchmarks while converging faster, especially when labeled data are extremely limited. For example, AWL&DA achieves 95.29% test accuracy on the CIFAR-10-40-labels experiment and 92.56% accuracy in a faster setting with only $2^{18}$ iterations.
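The abstract does not spell out the exact loss formulas, so the following PyTorch-style sketch only illustrates how the two ideas could plug into a FixMatch-style unsupervised loss. The class_progress statistic, the 0.95 threshold, the symmetric-KL form of the bidirectional DA term, and the da_weight trade-off are all hypothetical choices for illustration, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def awl_da_unsup_loss(logits_weak, logits_strong, class_progress,
                      threshold=0.95, da_weight=0.5):
    """Sketch of an unsupervised loss combining class-adaptive weighting
    (AWL-style) with a bidirectional consistency term on sub-threshold
    predictions (DA-style).

    class_progress: tensor of shape [num_classes] in [0, 1], a running
    estimate of per-class learning progress, e.g. the recent fraction of
    confident pseudo-labels per class (a hypothetical bookkeeping choice).
    """
    probs_weak = torch.softmax(logits_weak.detach(), dim=-1)
    conf, pseudo = probs_weak.max(dim=-1)
    above = conf >= threshold

    # AWL-style weighting: down-weight classes that already learn easily
    # so they do not dominate the pseudo-label loss (illustrative formula).
    class_weights = (1.0 - class_progress).clamp(min=0.1)
    per_example_w = class_weights[pseudo]
    ce = F.cross_entropy(logits_strong, pseudo, reduction="none")
    awl_loss = (per_example_w * ce * above.float()).mean()

    # DA-style term: sub-threshold examples still contribute through a
    # bidirectional (symmetric KL) consistency between weak and strong views.
    log_p_strong = F.log_softmax(logits_strong, dim=-1)
    log_p_weak = F.log_softmax(logits_weak, dim=-1)
    probs_strong = torch.softmax(logits_strong.detach(), dim=-1)
    kl_ws = F.kl_div(log_p_strong, probs_weak, reduction="none").sum(-1)
    kl_sw = F.kl_div(log_p_weak, probs_strong, reduction="none").sum(-1)
    below = (~above).float()
    da_loss = ((kl_ws + kl_sw) * below).sum() / below.sum().clamp(min=1.0)

    return awl_loss + da_weight * da_loss
```

Including the sub-threshold consistency term means every unlabeled example contributes gradient signal from the start of training, which matches the abstract's claim that DA speeds up convergence relative to threshold-only losses.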