Which method to use for optimal structure and function representation of large spiking neural networks: A case study on the NeuCube architecture

2016 
This study analyses different representations of large spiking neural network (SNN) structures for conventional computers, using the NeuCube SNN architecture as a case study. The representation covers neuronal connectivity as well as the network's and neurons' states during the learning process. Three structure types, namely the adjacency matrix, the adjacency list, and the edge-weight table, were compared in terms of storage requirements and the execution time of a learning algorithm for varying numbers of neurons in the network. The comparative analysis shows that the adjacency list, combined with a backward indexing mechanism, scales up most efficiently in both performance and storage requirements. The optimal algorithm was further used to simulate a large-scale NeuCube system with 241,606 spiking neurons in a 3D space for the prediction and analysis of benchmark spatio-temporal data.
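The abstract does not give implementation details of the winning representation. The following is a minimal Python sketch, under the assumption that "adjacency list with backward indexing" means per-neuron outgoing connection lists plus an index from each neuron back to its presynaptic sources; the class and method names (AdjacencyListSNN, add_synapse, propagate_spike, presynaptic_of) are illustrative and not taken from the paper.

```python
from collections import defaultdict


class AdjacencyListSNN:
    """Sketch of SNN connectivity stored as adjacency lists.

    Outgoing synapses are kept as per-neuron lists of (target, weight) pairs,
    and a backward index maps each neuron to its presynaptic sources so that
    learning rules needing incoming connections (e.g. STDP-like updates) can
    avoid scanning the whole network.
    """

    def __init__(self, n_neurons):
        self.n_neurons = n_neurons
        self.outgoing = defaultdict(list)   # pre -> [(post, weight), ...]
        self.incoming = defaultdict(list)   # post -> [pre, ...] (backward index)

    def add_synapse(self, pre, post, weight):
        """Register a directed synapse pre -> post with the given weight."""
        self.outgoing[pre].append((post, weight))
        self.incoming[post].append(pre)

    def propagate_spike(self, pre):
        """Return the weighted inputs delivered to targets when `pre` fires."""
        return {post: w for post, w in self.outgoing[pre]}

    def presynaptic_of(self, post):
        """Backward lookup: which neurons project to `post`?"""
        return self.incoming[post]


# Example usage: a tiny 3-neuron network.
net = AdjacencyListSNN(n_neurons=3)
net.add_synapse(0, 2, 0.5)
net.add_synapse(1, 2, 0.3)
print(net.propagate_spike(0))   # {2: 0.5}
print(net.presynaptic_of(2))    # [0, 1]
```

Compared with a dense adjacency matrix, this layout stores only the synapses that exist, which is the property that lets it scale to sparsely connected networks of hundreds of thousands of neurons such as the 241,606-neuron NeuCube system described above.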