NAS4RRAM: neural network architecture search for inference on RRAM-based accelerators

2021 
RRAM-based accelerators enable fast and energy-efficient inference for neural networks. However, deploying neural networks on RRAM-based accelerators imposes requirements that existing networks do not consider. (1) Because device noise and the analog-to-digital and digital-to-analog converters (ADCs/DACs) affect prediction accuracy, their effects should be modeled in the network. (2) Because weights are mapped to RRAM cells, they must be quantized, and the total number of weights is limited by the number of RRAM cells in the accelerator. These requirements motivate us to customize hardware-friendly networks for RRAM-based accelerators. We adopt the idea of neural architecture search (NAS) to design networks that meet these requirements while achieving high prediction accuracy, and propose a framework called NAS4RRAM that searches for the optimal network for a given RRAM-based accelerator. Experiments demonstrate that NAS4RRAM applies to RRAM-based accelerators of different scales, and that the searched networks outperform a manually designed ResNet.
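The two requirements above can be illustrated with a minimal sketch of a crossbar-style linear layer: weights are uniformly quantized to a few bits (standing in for discrete RRAM conductance levels), Gaussian noise perturbs the programmed weights, and the layer output is re-quantized to model the ADC. The bit widths, noise level, and quantization scheme here are illustrative assumptions, not the paper's actual hardware model.

```python
import numpy as np

def quantize(x, bits, x_max):
    # Uniform symmetric quantization to 2**bits - 1 levels
    # (an illustrative scheme, not the paper's exact one).
    levels = 2 ** bits - 1
    step = 2 * x_max / levels
    return np.round(np.clip(x, -x_max, x_max) / step) * step

def rram_linear(x, w, w_bits=4, adc_bits=8, noise_std=0.02, rng=None):
    """Simulate a linear layer mapped onto an RRAM crossbar:
    quantized weights + Gaussian programming noise + ADC output
    quantization. All parameters are assumed for illustration."""
    rng = np.random.default_rng(0) if rng is None else rng
    w_max = np.abs(w).max()
    w_q = quantize(w, w_bits, w_max)                     # requirement (2)
    w_noisy = w_q + rng.normal(0.0, noise_std * w_max,   # noise, req. (1)
                               size=w.shape)
    y = x @ w_noisy                                      # analog MAC
    return quantize(y, adc_bits, np.abs(y).max())        # ADC, req. (1)
```

Training or searching architectures with such a layer in the loop lets the search account for noise and quantization error instead of discovering them only at deployment time.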