Leveraging spiking deep neural networks to understand neural mechanisms underlying selective attention
2020
Spatial attention enhances sensory processing of goal-relevant information and improves perceptual sensitivity. The specific mechanisms linking neural changes to changes in performance are still contested. Here, we examine different attention mechanisms in spiking deep convolutional neural networks. We directly contrast effects of noise suppression (precision) and two different gain modulation mechanisms on performance on a visual search task with complex real-world images. Unlike standard artificial neurons, biological neurons have saturating activation functions, permitting implementation of attentional gain as gain on a neuron's input or on its outgoing connection. We show that modulating the connection is most effective in selectively enhancing information processing by redistributing spiking activity, and by introducing additional task-relevant information, as shown by representational similarity analyses. Precision did not produce attentional effects in performance. Our results, which mirror empirical findings, show that it is possible to adjudicate between attention mechanisms using more biologically realistic models and natural stimuli.
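The distinction between the two gain mechanisms hinges on the saturating activation: scaling a neuron's input is squashed by the nonlinearity, while scaling its outgoing connection is not. A minimal sketch of that asymmetry (illustrative only; the function names and gain value are assumptions, not the paper's implementation, which uses spiking networks):

```python
import numpy as np

def saturating_activation(x):
    """Sigmoid: a saturating nonlinearity, unlike the unbounded ReLU
    of standard artificial neurons."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-4.0, 4.0, 9)  # range of input drives
g = 2.0                        # illustrative attentional gain factor

# Input gain: scale the drive *before* the nonlinearity.
# Saturation caps the effect for strongly driven neurons.
input_gain_out = saturating_activation(g * x)

# Output (connection) gain: scale the signal *after* the nonlinearity.
# The outgoing value can exceed the activation's ceiling,
# redistributing activity among downstream neurons.
output_gain_out = g * saturating_activation(x)

print(input_gain_out.max())   # bounded below 1 by the sigmoid
print(output_gain_out.max())  # approaches g = 2
```

Under input gain, strongly driven neurons all pile up near the activation ceiling, so their relative differences shrink; connection gain preserves those differences while amplifying them downstream, consistent with the abstract's finding that modulating the connection is the more effective mechanism.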