Temperature network for few-shot learning with distribution-aware large-margin metric

2021 
Abstract Few-shot learning aims to classify unseen data from only a handful of training samples and has attracted increasing attention recently. In this paper, we propose a novel Temperature Network to tackle few-shot learning tasks, motivated by three crucial factors that are seldom considered in the existing literature. First, we propose a general improvement for prototype-based methods that encourages a compact intra-class distribution, and we validate its effectiveness both theoretically and experimentally. Second, the proposed Temperature Network implicitly generates query-specific prototypes and thus enjoys a more effective, distribution-aware metric. Third, to further strengthen the generalization ability of the model, we develop a novel and simple large-margin method by leveraging the temperature function, gradually tuning the learning temperature to stabilize training. Moreover, we note that the datasets commonly used in few-shot learning are contrived from large-scale datasets and thus may not represent real few-shot problems. We therefore propose a real-life few-shot problem, Dermnet skin disease classification, to comprehensively evaluate few-shot learning methods. Experiments conducted on conventional datasets as well as the proposed skin disease dataset demonstrate the superiority of the proposed method over other state-of-the-art methods. The source code of our method is available.
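The abstract only sketches the mechanism, so the following is a minimal illustration of the general idea behind a temperature-scaled, large-margin prototype classifier: class prototypes are support-set means, logits are temperature-scaled negative distances, and an additive margin is subtracted from the true-class logit during training. All function names, the temperature value, and the margin value here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def prototypes(support, labels, n_classes):
    # Class prototype = mean embedding of that class's support samples.
    return np.stack([support[labels == c].mean(axis=0) for c in range(n_classes)])

def margin_logits(query, protos, temperature=10.0, margin=0.5, true_class=None):
    # Logit = temperature-scaled negative squared Euclidean distance.
    d = ((query[None, :] - protos) ** 2).sum(axis=1)
    logits = -temperature * d
    if true_class is not None:
        # Large-margin training: penalize the true class's logit so the
        # model must beat the other classes by at least the margin.
        logits[true_class] -= margin
    return logits

# Toy usage: 3 classes, 2-D embeddings, 2 support points per class.
support = np.array([[5.0, 0.2], [5.2, -0.1],
                    [0.1, 5.0], [-0.2, 5.1],
                    [0.0, 0.0], [0.3, -0.2]])
labels = np.array([0, 0, 1, 1, 2, 2])
protos = prototypes(support, labels, 3)
query = np.array([4.9, 0.0])
print(margin_logits(query, protos).argmax())  # nearest prototype: class 0
```

At inference the margin is dropped (`true_class=None`), so prediction reduces to nearest-prototype classification under the temperature-scaled metric.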