SPSNN: nth Order Sequence-Predicting Spiking Neural Network
2020
We introduce a means of harnessing spiking neural networks (SNNs) with rich dynamics as a dynamic hypothesis to learn complex sequences. The proposed SNN is referred to as the nth order sequence-predicting SNN (n-SPSNN), which is capable of single-step prediction and sequence-to-sequence prediction, i.e., associative recall. As the key to these capabilities, we propose a new learning algorithm, named the learning by backpropagating action potential (LbAP) algorithm, which features (i) postsynaptic event-driven learning, (ii) access to topologically and temporally local data only, (iii) a competition-induced weight-normalization effect, and (iv) fast learning. Most importantly, the LbAP algorithm offers a unified learning framework over the entire SPSNN based on local data only. The learning capacity of the SPSNN is mainly dictated by the number of hidden neurons h; its prediction accuracy reaches its maximum (~1) when h is at least twice the training sequence length l, i.e., h ≥ 2l. Another advantage is its high tolerance to errors in input encoding compared with state-of-the-art sequence-learning networks, namely long short-term memory (LSTM) and gated recurrent unit (GRU) networks. Additionally, its learning efficiency is approximately 100 times that of LSTM and GRU when measured as the number of synaptic operations until successful training, which correspond to multiply-accumulate operations for LSTM and GRU. This high efficiency arises from the higher learning rate of the SPSNN, which is attributed to the LbAP algorithm. The code is available online (https://github.com/galactico7/SPSNN).
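To make the listed properties concrete, the sketch below illustrates what a postsynaptic event-driven, purely local learning rule with winner-take-all competition and explicit weight normalization can look like. This is a minimal illustration under stated assumptions, not the authors' LbAP implementation (see the linked repository for that); all names and parameter values (n_in, h, v_th, v_decay, lr) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: n_in input neurons, h hidden neurons.
# Per the abstract's capacity rule, h is chosen >= 2 * sequence length l.
n_in, l = 8, 4
h = 2 * l

W = rng.uniform(0.0, 0.5, size=(h, n_in))   # input -> hidden weights
v = np.zeros(h)                              # membrane potentials
v_th, v_decay, lr = 1.0, 0.9, 0.05           # threshold, leak, learning rate

def step(x_spikes):
    """Advance one time step; return the hidden spike vector.

    Learning is postsynaptic event-driven: weights change only when a
    hidden neuron fires, and each update uses only data local to that
    neuron (its own row of W and the current presynaptic spikes).
    """
    global v
    v = v_decay * v + W @ x_spikes            # leaky integration of input spikes
    fired = v >= v_th
    out = np.zeros(h, dtype=bool)
    if fired.any():
        # Winner-take-all competition: only the most depolarized neuron
        # above threshold emits a spike and learns at this step.
        winner = np.argmax(np.where(fired, v, -np.inf))
        # Local Hebbian-style update toward the presynaptic spike pattern.
        W[winner] += lr * (x_spikes - W[winner])
        # Normalizing the winner's weights, combined with the competition,
        # stands in for a competition-induced weight-normalization effect.
        W[winner] /= max(W[winner].sum(), 1e-9)
        v[winner] = 0.0                       # reset the winner after firing
        out[winner] = True
    return out

# Usage: drive the network with a repeating one-hot spike sequence of length l.
seq = [np.eye(n_in)[t % n_in] for t in range(l)]
for epoch in range(20):
    for x in seq:
        step(x)
```

With h = 2l hidden neurons, repeated presentations let distinct winners specialize on distinct sequence elements, which is the intuition behind the h ≥ 2l capacity rule quoted above.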