Recursive Synaptic Bit Reuse: An Efficient Way to Increase Memory Capacity in Associative Memory

2019 
Neural associative memory (AM) is one of the critical building blocks for cognitive computing systems. It memorizes (learns) and retrieves input data by the information content itself. One of the key challenges in designing AM for intelligent devices is to expand memory capacity while using a minimal amount of hardware and energy resources. However, prior art shows that memory capacity grows slowly, i.e., as the square root of the total number of synaptic weights. To tackle this problem, we propose a synapse model called recursive synaptic bit reuse, which enables near-linear scaling of memory capacity with the total number of synaptic bits. Our model can also handle correlated input data more robustly than the conventional model. We evaluated our model in the context of Hopfield neural networks (HNNs) that contain 5–327 KB of data storage for synaptic weights. Our model can increase the memory capacity of HNNs by as much as $30\times$ over the conventional ones. The very-large-scale integration (VLSI) implementation of HNNs in 65 nm confirms that our proposed model can save up to $19\times$ area and up to $232\times$ energy dissipation compared with the conventional model. These savings are expected to grow with the network size.
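The square-root claim refers to the classical Hopfield capacity result: a network of $N$ neurons stores roughly $0.14N$ random patterns in $N^2$ synaptic weights, so capacity grows only as the square root of the total weight count. Below is a minimal sketch of such a conventional Hopfield network with Hebbian (outer-product) learning, illustrating the baseline the paper improves upon; the proposed recursive-bit-reuse synapse is not reproduced here, since the abstract does not specify it, and all function names and parameters are illustrative assumptions.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian (outer-product) learning for a conventional Hopfield network.
    patterns: array of shape (P, N) with entries in {-1, +1}."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N      # sum of outer products, scaled by N
    np.fill_diagonal(W, 0.0)           # no self-connections
    return W

def recall(W, probe, sweeps=20):
    """Asynchronous recall: update neurons in random order until the state settles."""
    s = probe.copy()
    N = len(s)
    for _ in range(sweeps):
        for i in np.random.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 256                                   # neurons -> N*N synaptic weights
    P = int(0.14 * N)                         # classical capacity: ~0.14*N patterns
    patterns = rng.choice([-1, 1], size=(P, N)).astype(float)
    W = train_hopfield(patterns)

    # Corrupt 10% of one stored pattern and check that recall restores it.
    probe = patterns[0].copy()
    flip = rng.choice(N, size=N // 10, replace=False)
    probe[flip] *= -1
    recovered = recall(W, probe)
    print("fraction of bits recovered:", np.mean(recovered == patterns[0]))
```

In this conventional setup, doubling the number of neurons quadruples the synaptic storage but only doubles the number of storable patterns, which is the scaling bottleneck the proposed recursive synaptic bit reuse addresses.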