NVM Device-Based Deep Inference Architecture Using Self-gated Activation Functions (Swish)

2021 
This paper proposes a Non-Volatile Memory (NVM) device-based deep inference architecture and a novel analog-component design of the Swish activation function. The deep neural network is built on a 1T-1RRAM crossbar structure. The paper discusses the importance of activation functions in analog hardware, compares the proposed self-gated activation function with existing designs in the literature, and implements a deep inference architecture evaluated on multiple datasets. The design has been evaluated for total (peak) power, operating voltage, resistance characteristics, and speed; the results indicate that the self-gated activation function with an RRAM device outperforms Sigmoid and ReLU functions implemented with memristors. The total (peak) power of the activation-function circuit is reduced by 83.4% and the operating voltage by 60% compared to a memristor-based sigmoid, and the ON/OFF ratio is improved by 23.49 compared to a memristor-based ReLU. Performance of the inference architecture is demonstrated on the iris, balance scale, and bank note authentication datasets: the observed classification accuracy is 100% on the iris and bank note authentication datasets and 99.87% on the balance scale dataset. The analog hardware design of the deep neural network has been implemented in the UMC 180 nm technology node, and the network has been trained offline using MathWorks® MATLAB.
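The self-gated (Swish) activation that the paper realizes with analog components is mathematically x · σ(βx), where σ is the logistic sigmoid. A minimal software sketch of the function (the β value and sample inputs below are illustrative choices, not taken from the paper):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def swish(x, beta=1.0):
    """Self-gated (Swish) activation: x * sigmoid(beta * x).

    Unlike ReLU, Swish is smooth and non-monotonic near zero;
    for large positive x it approaches x, and for large negative
    x it approaches 0.
    """
    return x * sigmoid(beta * x)

# Illustrative evaluations (not figures from the paper):
print(swish(0.0))   # gate is 0.5 at x=0, so output is 0
print(swish(2.0))   # close to 2, since sigmoid(2) ~ 0.88
print(swish(-2.0))  # small negative value, unlike ReLU's hard 0
```

With β = 1 this is the SiLU variant of Swish; larger β sharpens the gate toward ReLU-like behavior, which is one reason self-gated functions are attractive as a tunable analog activation.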