Retraining and Regularization to Optimize Neural Networks for Stochastic Computing

2020 
Stochastic computing (SC) is a promising computation technique for applications involving huge numbers of individually simple operations. One of the main application targets for SC is the design of convolutional neural networks (CNNs), due to their heavy reliance on multiply-accumulate (MAC) operations, which have compact and power-efficient SC realizations. We present a training optimization method that can significantly improve the accuracy of SC-based CNNs. Using regularization techniques in combination with a newly developed retraining algorithm, we enforce sparsity, reduce the number of required multiplexer (MUX)-based additions, and thus minimize undesired downscaling. Our results show that the proposed regularized training procedure (RTP) can reduce the computation time of SC-based CNNs by a factor of four without sacrificing classification accuracy, while also reducing area in hardware implementations. These findings suggest that SC can be successfully applied to more powerful CNNs than previously thought possible, thus extending the range of potential uses of SC.
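The downscaling mentioned in the abstract is inherent to standard unipolar SC arithmetic: an AND gate multiplies the probabilities encoded by two independent bit-streams, while a MUX with a random select line outputs the *scaled* sum (a + b)/2 rather than a + b. The following sketch (not from the paper; function names and stream lengths are illustrative) simulates both operations:

```python
import random

def sc_stream(p, n, rng):
    """Unipolar stochastic stream: each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_value(bits):
    """Decode a stream back to a probability estimate."""
    return sum(bits) / len(bits)

def sc_multiply(a_bits, b_bits):
    """AND gate: multiplies the probabilities of independent streams."""
    return [a & b for a, b in zip(a_bits, b_bits)]

def sc_mux_add(a_bits, b_bits, rng):
    """MUX with a p=0.5 select line: computes (a + b) / 2.

    The factor 1/2 is the undesired downscaling that RTP reduces
    by cutting the number of MUX-based additions.
    """
    return [a if rng.random() < 0.5 else b for a, b in zip(a_bits, b_bits)]

rng = random.Random(0)
n = 100_000
a = sc_stream(0.6, n, rng)
b = sc_stream(0.3, n, rng)
prod = sc_value(sc_multiply(a, b))        # ~ 0.6 * 0.3 = 0.18
scaled_sum = sc_value(sc_mux_add(a, b, rng))  # ~ (0.6 + 0.3) / 2 = 0.45
print(prod, scaled_sum)
```

Each MUX level in an adder tree halves the represented magnitude, so deep accumulations lose precision; enforcing sparsity shrinks the tree and keeps values in a usable range.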