Comparative Study on Quantization-Aware Training of Memristor Crossbars for Reducing Inference Power of Neural Networks at The Edge

2021 
As Internet-of-Things (IoT) technology spreads widely through human life, massive numbers of IoT sensors and edge devices generate huge amounts of unstructured data everywhere, all the time. To mitigate the energy burden of computation and communication for processing these data at cloud servers, edge intelligence becomes essential in IoT sensors. In this paper, toward implementing edge intelligence in IoT sensors, a comparative study on the training of memristor crossbars is carried out for reducing the crossbar's inference power at the edge. To understand the relationship between Convolutional Neural Network (CNN) architecture and crossbar power consumption, memristor-crossbar CNNs with different synapse types, kernel sizes, and percentages of Low Resistance State (LRS) cells in the crossbar are compared and analyzed. The comparative study suggests that ternary synapses, a small kernel size, and a reduced number of active bits achieve a higher recognition rate and lower crossbar power consumption than the other memristor-crossbar CNNs. Adjusting the percentage of LRS cells in the crossbar shows that the recognition rate begins to fall sharply when LRS cells make up less than 10% of the total memristor cells, for both the Modified National Institute of Standards and Technology (MNIST) and Canadian Institute For Advanced Research (CIFAR-10) datasets. To minimize the recognition-rate loss caused by reducing the active bits, quantization-aware training of the memristor crossbars is combined with optimization of the crossbar's inference power: training with weight quantization is repeated to minimize the recognition-rate loss until the inference power of the memristor-crossbar CNN reaches a target value.
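The repeat-until-power-target loop described above can be sketched as follows. This is a minimal illustration, not the paper's actual method: the function names (`ternarize`, `lrs_fraction`, `train_to_power_target`), the threshold-raising heuristic, and the use of the LRS-cell fraction as a power proxy are all assumptions for the sketch, and the accuracy-recovering retraining step between iterations is elided.

```python
import numpy as np

def ternarize(w, threshold):
    """Map float weights to ternary {-1, 0, +1}. Zero weights correspond to
    High Resistance State (HRS) cells, non-zero weights to LRS cells."""
    q = np.zeros_like(w)
    q[w > threshold] = 1.0
    q[w < -threshold] = -1.0
    return q

def lrs_fraction(q):
    """Power proxy (assumption): fraction of active (LRS) memristor cells,
    since LRS cells conduct and dominate crossbar inference power."""
    return np.count_nonzero(q) / q.size

def train_to_power_target(w, target_lrs, threshold=0.1, step=0.05, max_iters=50):
    """Repeat ternary quantization, raising the threshold (driving more cells
    to HRS) until the LRS fraction meets the power target. A real QAT loop
    would retrain the float weights w between iterations to recover the
    recognition rate; that step is omitted in this sketch."""
    q = ternarize(w, threshold)
    for _ in range(max_iters):
        if lrs_fraction(q) <= target_lrs:
            break
        threshold += step  # prune more weights -> fewer LRS cells -> less power
        q = ternarize(w, threshold)
    return q, threshold

# Toy usage: push a random weight matrix down to 10% LRS cells,
# the knee point reported in the abstract.
rng = np.random.default_rng(0)
weights = rng.standard_normal((32, 32))
q, th = train_to_power_target(weights, target_lrs=0.10)
```

The 10% target mirrors the abstract's observation that recognition rate degrades sharply below that LRS-cell percentage, so it is a natural stopping point for the power optimization.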