ESAE: Evolutionary Strategy-Based Architecture Evolution

2019 
Although deep neural networks (DNNs) play important roles in many fields, the architecture design of DNNs can be challenging due to the difficulty of input data representation, the huge number of parameters and the complex layer relationships. To overcome the obstacles of architecture design, we developed a new method to generate the optimal structure of DNNs, named Evolutionary Strategy-based Architecture Evolution (ESAE), consisting of a bi-level representation and a probability distribution learning approach. The bi-level representation encodes architectures at the gene and parameter levels. The probability distribution learning approach ensures the efficient convergence of the architecture search process. The effectiveness of the proposed ESAE is verified on Fashion-MNIST and CIFAR-10. The evolved DNNs, starting from a trivial initial architecture with a single convolutional layer, achieved accuracies of 94.48% and 93.49% on Fashion-MNIST and CIFAR-10, respectively, and required remarkably lower hardware costs in terms of GPUs and running time, compared with existing state-of-the-art manually designed architectures.
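The abstract does not give the concrete encoding, so the following is a minimal, hypothetical sketch of what a bi-level representation and a mutation step could look like: a gene-level entry selects a layer type and a parameter-level entry holds that layer's hyperparameters. All names, layer types, and hyperparameter choices here are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch of a bi-level architecture encoding (not the paper's code):
# the gene level picks a layer type, the parameter level holds its hyperparameters.
import random
from dataclasses import dataclass, field


@dataclass
class Gene:
    layer_type: str                              # gene level: which layer to use
    params: dict = field(default_factory=dict)   # parameter level: its hyperparameters


def random_gene(rng: random.Random) -> Gene:
    """Sample one gene with both levels filled in (illustrative choices only)."""
    layer_type = rng.choice(["conv", "pool", "dense"])
    if layer_type == "conv":
        params = {"filters": rng.choice([16, 32, 64]), "kernel": rng.choice([3, 5])}
    elif layer_type == "pool":
        params = {"size": 2}
    else:
        params = {"units": rng.choice([64, 128, 256])}
    return Gene(layer_type, params)


def mutate(genome: list, rng: random.Random) -> list:
    """Evolution-strategy-style mutation: add, drop, or perturb one gene."""
    child = [Gene(g.layer_type, dict(g.params)) for g in genome]
    op = rng.choice(["add", "drop", "perturb"])
    if op == "add" or not child:
        child.insert(rng.randrange(len(child) + 1), random_gene(rng))
    elif op == "drop" and len(child) > 1:
        child.pop(rng.randrange(len(child)))
    else:
        child[rng.randrange(len(child))] = random_gene(rng)
    return child


if __name__ == "__main__":
    rng = random.Random(0)
    # Start from a trivial initial architecture: a single convolutional layer,
    # as the abstract describes, then apply a few mutations.
    genome = [Gene("conv", {"filters": 16, "kernel": 3})]
    for _ in range(5):
        genome = mutate(genome, rng)
    print([(g.layer_type, g.params) for g in genome])
```

In the actual method, such mutations would be guided by a learned probability distribution over architectural choices rather than uniform sampling; the uniform `rng.choice` calls above are a placeholder for that component.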