Energy-efficient stochastic computing with superparamagnetic tunnel junctions
2019
Superparamagnetic tunnel junctions have emerged as a competitive, realistic nanotechnology for supporting novel forms of stochastic computation in CMOS-compatible platforms. One of their applications is generating the random bitstreams used in stochastic computing implementations. We describe a method for digitally programmable bitstream generation based on pre-charge sense amplifiers that is more energy efficient than previously explored alternatives. The energy savings offered by this digital generator persist when the generators are used as the fundamental units of a neural network architecture. To take advantage of the potential savings, we codesign the algorithm with the circuit rather than directly transcribing a classical neural network into hardware. The flexibility of the neural network mathematics compensates for the explicitly energy-efficient choices we make at the device level. The result is a convolutional neural network design operating at $\approx 150~nJ$ per inference with $97\%$ accuracy on MNIST---nearly an order of magnitude more energy efficient than comparable proposals in the recent literature.
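The abstract's premise rests on stochastic computing, where a value is encoded as the probability of a 1 in a random bitstream and arithmetic reduces to simple logic gates. The sketch below illustrates this standard encoding in software, with a software random source standing in for the physical superparamagnetic tunnel junction; function names and stream lengths are illustrative, not the paper's implementation.

```python
import random

def bitstream(p, n, rng):
    # Encode a value p in [0, 1] as a length-n bitstream:
    # each bit is 1 with probability p (Bernoulli sampling).
    # A hardware generator would produce these bits physically.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_multiply(a_bits, b_bits):
    # In unipolar stochastic computing, multiplication of two
    # values is a bitwise AND of their independent bitstreams,
    # since P(a=1 and b=1) = P(a=1) * P(b=1).
    return [a & b for a, b in zip(a_bits, b_bits)]

def decode(bits):
    # Recover the encoded value as the fraction of 1s.
    return sum(bits) / len(bits)

rng = random.Random(0)
n = 100_000
product = decode(sc_multiply(bitstream(0.6, n, rng),
                             bitstream(0.5, n, rng)))
# product is close to 0.6 * 0.5 = 0.3, with sampling noise
# that shrinks as 1/sqrt(n)
```

The appeal for low-energy hardware is that the AND gate replaces a full multiplier; the cost shifts to generating long, uncorrelated bitstreams, which is exactly the cost the paper's digital generator targets.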