Fast Evolutionary Neural Architecture Search Based on Bayesian Surrogate Model
2021
Neural Architecture Search (NAS) aims to automatically design deep neural network structures, freeing people from heavy manual network design. Traditional NAS based on individual performance evaluation must train many candidate networks generated during the search and compare them by their accuracy, which is very time-consuming. In this study, we propose a two-category comparator based on a random forest model as a surrogate to estimate network performance, thereby reducing the heavy network training process and greatly saving search time. Instead of directly predicting the accuracy of each network, the proposed two-category comparator compares the relative performance of each pair of networks. Furthermore, we carry out the modeling of the surrogate in a sampled subspace of the original training data, which further accelerates the search process. Experimental results show that the proposed NAS framework greatly reduces search time, while the accuracy of the obtained networks is comparable to that of other state-of-the-art NAS algorithms.
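The core idea of the two-category comparator can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: architecture encodings, the synthetic accuracy signal, and the helper names (`better`, `enc`, `acc`) are all assumptions. Each training example concatenates the encodings of two architectures, and the label says whether the first outperforms the second; a random forest then ranks unseen candidates by pairwise comparison instead of training them.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy data (illustrative): 200 architectures with 10-dim encodings and a
# synthetic stand-in for measured validation accuracy.
enc = rng.random((200, 10))
acc = enc @ rng.random(10)

# Build the pairwise ("two-category") training set: concatenate the
# encodings of two architectures; label 1 if the first is more accurate.
idx_a = rng.integers(0, 200, 1000)
idx_b = rng.integers(0, 200, 1000)
X = np.hstack([enc[idx_a], enc[idx_b]])
y = (acc[idx_a] > acc[idx_b]).astype(int)

comparator = RandomForestClassifier(n_estimators=100, random_state=0)
comparator.fit(X, y)

def better(a, b):
    """Surrogate prediction: does architecture `a` beat architecture `b`?"""
    pair = np.hstack([a, b]).reshape(1, -1)
    return bool(comparator.predict(pair)[0])

# Rank candidate architectures by pairwise wins, with no network training.
cands = rng.random((5, 10))
wins = [sum(better(cands[i], cands[j]) for j in range(5) if j != i)
        for i in range(5)]
best = int(np.argmax(wins))
```

In an evolutionary NAS loop, `better` would replace the expensive train-and-evaluate step during selection, with occasional real evaluations refreshing the surrogate's training pairs.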