A joint end-to-end framework for learning with noisy labels

2021 
Abstract Deep neural networks (DNNs) have achieved excellent performance in image classification, owing in part to large-scale training data with accurate annotations. However, collecting such clean data is expensive and time-consuming. In contrast, when a dataset is collected by crawling websites, noisy labels are ubiquitous, making it easy for DNNs to overfit the noisy labels and suffer performance degradation. Most recent efforts defend against noisy labels either by simply discarding high-loss samples, which are treated as noise, or by reweighting the training data in the loss function. Both strategies inevitably depend on prior conditions, such as a clean validation set or a ground-truth noise transition matrix, which are impractical for real-world datasets. In this paper, we propose a novel end-to-end framework for noise correction, called End-to-end Correction with Mixup and Balance terms (ECMB). ECMB corrects noisy labels to their true labels while keeping the number of samples in each class balanced. The framework uses a backbone network pre-trained with an improved Mixup entropy instead of the traditional cross entropy, and it does not need any extra conditions. In addition, we introduce a new balance term that updates noisy labels more accurately. Experimental results on the publicly available CIFAR-10, CIFAR-100 and Clothing1M datasets demonstrate that our method outperforms other state-of-the-art methods.
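For background, the sketch below illustrates standard Mixup training (Zhang et al., 2018), the technique the pre-training step builds on: inputs and labels are linearly interpolated with a coefficient drawn from a Beta distribution, and the cross-entropy loss is interpolated with the same coefficient. This is a minimal sketch of vanilla Mixup, not the paper's improved Mixup entropy or balance term, which the abstract does not detail; the function names (`mixup_batch`, `mixup_cross_entropy`) and the parameter `alpha` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def mixup_batch(x, y, alpha=1.0):
    """Mix each example with a randomly permuted partner.

    Returns the mixed inputs, both label sets, and the mixing
    coefficient lam ~ Beta(alpha, alpha).
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1.0 - lam) * x[perm], y, y[perm], lam

def mixup_cross_entropy(logits, y_a, y_b, lam):
    """Cross entropy interpolated with the same coefficient as the inputs."""
    return lam * F.cross_entropy(logits, y_a) + (1.0 - lam) * F.cross_entropy(logits, y_b)

# Usage inside a hypothetical training step (model and optimizer assumed):
# x_mix, y_a, y_b, lam = mixup_batch(images, labels, alpha=1.0)
# loss = mixup_cross_entropy(model(x_mix), y_a, y_b, lam)
```

Because every training example is a convex combination of two samples, a network trained this way is less prone to memorizing any individual (possibly mislabeled) example, which is why Mixup-style losses are a common starting point for noisy-label methods.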