DoB-SNN: A New Neuron Assembly-Inspired Spiking Neural Network for Pattern Classification
2021
Spiking neural networks (SNNs), as the third generation of artificial neural networks, are closer to their biological counterparts than their predecessors. SNNs have a higher computational capacity and lower power requirements than networks of sigmoidal neurons. In this paper, a new spiking neural network for pattern classification, referred to as the Degree of Belonging SNN (DoB-SNN), is introduced. DoB-SNN is inspired by a neuronal assembly in which each neuron has a degree of belonging (DoB) to every class of data being processed. DoB-SNN clusters the neurons during training using the DoBs to allocate a group of neurons to each class. A new training algorithm is presented to adjust the DoBs along with the network's synaptic weights, based on Spike-Timing Dependent Plasticity (STDP) and the neurons' activity on training samples. The performance of DoB-SNN is evaluated on five datasets from the UCI machine learning repository. Nested cross-validation is employed to determine the network's hyperparameters for each dataset and to thoroughly assess generalisation capability. A detailed comparison on these datasets with three other supervised learning algorithms, namely SpikeProp, SWAT, and SRESN, is provided. The results show that no algorithm significantly outperforms DoB-SNN, whereas DoB-SNN performs significantly better than the others on the Liver Disorders dataset (>6.10%, $p$). Accuracies obtained by DoB-SNN are significantly greater than SWAT for both Iris and Breast Cancer (>1.69%, $p$) and significantly better than SpikeProp for Iris (1.62%, $p=0.04$). In all comparisons, DoB-SNN used the smallest network. DoB-SNN therefore offers significant potential as an alternative SNN architecture and learning algorithm.
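The abstract describes two quantities learned jointly: synaptic weights (via an STDP-based rule) and each neuron's degree of belonging to every class, driven by the neuron's activity on training samples. The sketch below is only an illustration of that general idea; the function names, learning rates, update forms, and the activity-weighted prediction rule are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_classes = 10, 3

weights = rng.normal(0.5, 0.1, size=(n_neurons, n_neurons))  # synaptic weights
dob = np.full((n_neurons, n_classes), 1.0 / n_classes)       # degrees of belonging

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic spike, depress otherwise (standard exponential window)."""
    dt = t_post - t_pre
    if dt > 0:
        return w + a_plus * np.exp(-dt / tau)
    return w - a_minus * np.exp(dt / tau)

def dob_update(dob_row, label, activity, eta=0.05):
    """Increase a neuron's belonging to the sample's class in proportion to its
    spiking activity, then renormalise so the row remains a distribution."""
    dob_row = dob_row.copy()
    dob_row[label] += eta * activity
    return dob_row / dob_row.sum()

# One hypothetical training step for a sample of class 1:
label = 1
spike_counts = rng.integers(0, 5, size=n_neurons)   # per-neuron activity
for i in range(n_neurons):
    dob[i] = dob_update(dob[i], label, spike_counts[i])

# Example STDP adjustment for one synapse, given spike times in milliseconds:
weights[0, 1] = stdp_update(weights[0, 1], t_pre=12.0, t_post=15.0)

# Prediction could pool class evidence as activity weighted by the DoBs:
class_scores = spike_counts @ dob
predicted = int(np.argmax(class_scores))
```

Under these assumptions, neurons naturally cluster by class because repeated activity on samples of one class concentrates their DoB mass there, which is the clustering behaviour the abstract attributes to DoB-SNN.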