Training Spiking Neural Networks with Synaptic Plasticity under Integer Representation

2021 
Neuromorphic computing is emerging as a promising Beyond Moore computing paradigm that employs event-triggered computation and non-von Neumann hardware. Spike Timing Dependent Plasticity (STDP) is a well-known bio-inspired learning rule that relies on the activities of locally connected neurons to adjust the weights of their respective synapses. In this work, we analyze a basic STDP rule and its sensitivity to different hyperparameters for training spiking neural networks (SNNs) with supervision, customized for a neuromorphic hardware implementation with integer weights. We compare the classification performance on four UCI datasets (iris, wine, breast cancer, and digits) of varying complexity. We perform a search for the optimal set of hyperparameters using both grid search and Bayesian optimization. Through the use of Bayesian optimization, we show the general trends in hyperparameter sensitivity for SNN classification problems. With the best sets of hyperparameters, we achieve accuracies comparable to some of the best-performing SNNs on these four datasets. With a highly optimized supervised STDP rule, we show that these accuracies can be achieved with just 20 epochs of training.
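The abstract's central idea, a pair-based STDP weight update constrained to integer weights for hardware, can be sketched as follows. This is an illustrative minimal example, not the paper's actual rule: the constants (A_POT, A_DEP, TAU, W_MIN, W_MAX) and the exponential timing windows are common assumptions in the STDP literature, and the specific values are hypothetical.

```python
import numpy as np

# Hypothetical sketch of a pair-based STDP update under integer weight
# representation. Constants are illustrative, not taken from the paper.
A_POT = 4              # integer potentiation step size
A_DEP = 2              # integer depression step size
TAU = 20.0             # trace time constant (ms), assumed exponential window
W_MIN, W_MAX = 0, 255  # integer weight bounds, e.g. an 8-bit synapse

def stdp_update(w, dt):
    """Return the updated integer weight for one pre/post spike pair.

    w  : current integer weight
    dt : t_post - t_pre in ms (positive => pre fired before post)
    """
    if dt > 0:
        # Causal pair: potentiate, with the change rounded to an integer.
        dw = int(round(A_POT * np.exp(-dt / TAU)))
    else:
        # Anti-causal pair: depress.
        dw = -int(round(A_DEP * np.exp(dt / TAU)))
    # Clip so the weight stays representable in hardware.
    return int(np.clip(w + dw, W_MIN, W_MAX))
```

Rounding the weight change and clipping to a fixed range are the two places where an integer-hardware implementation departs from the usual floating-point STDP formulation; the step sizes and time constant are exactly the kind of hyperparameters the abstract describes tuning via grid search and Bayesian optimization.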