Fast and Unsupervised Neural Architecture Evolution for Visual Representation Learning

2021 
Unsupervised visual representation learning is one of the hottest topics in computer vision, yet its performance still lags behind that of the best supervised learning methods. At the same time, neural architecture search (NAS) has produced state-of-the-art results on various visual tasks. Exploring NAS as a way to improve unsupervised representation learning is therefore a natural idea, yet it remains largely unexplored. In this paper, we propose a Fast and Unsupervised Neural Architecture Evolution (FaUNAE) method that evolves an existing architecture, either manually constructed or obtained by NAS on a small dataset, into a new architecture that can operate on a larger dataset. This partial optimization exploits prior knowledge to reduce search cost and improve search efficiency. The evolution is self-supervised: a contrastive loss serves as the evaluation metric in a student-teacher framework. By eliminating the inferior or least promising operations, the evolutionary process is greatly accelerated. Experimental results show that we achieve state-of-the-art performance on downstream applications such as object recognition, object detection, and instance segmentation.
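The abstract states that a contrastive loss in a student-teacher framework acts as the evaluation signal during architecture evolution. The paper does not give the exact loss here; the sketch below is a minimal, hypothetical InfoNCE-style contrastive loss of the kind commonly used in such student-teacher setups (e.g. MoCo-like), where the student and teacher embeddings of the same image form the positive pair and a queue of past teacher embeddings supplies the negatives. All names (`info_nce_loss`, `queue`, `temperature`) are illustrative assumptions, not the authors' API.

```python
import numpy as np

def info_nce_loss(student_emb, teacher_emb, queue, temperature=0.07):
    """InfoNCE-style contrastive loss (illustrative sketch).

    student_emb: (N, D) embeddings from the student (searched) network.
    teacher_emb: (N, D) embeddings of the same images from the teacher.
    queue:       (K, D) past teacher embeddings used as negatives.
    """
    # L2-normalize so dot products are cosine similarities.
    s = student_emb / np.linalg.norm(student_emb, axis=1, keepdims=True)
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=1, keepdims=True)
    q = queue / np.linalg.norm(queue, axis=1, keepdims=True)

    pos = np.sum(s * t, axis=1, keepdims=True)   # (N, 1) positive logits
    neg = s @ q.T                                # (N, K) negative logits
    logits = np.concatenate([pos, neg], axis=1) / temperature

    # Cross-entropy with the positive always at index 0.
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[:, 0].mean()
```

In a search loop of the kind described, candidate operations whose student network yields a higher loss under this metric would be the "inferior or least promising" ones to eliminate.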