A computational model for the neurobiological substrates of visual attention
1995
Abstract
Two interesting and complex tasks are performed by the brain in the process of perception: the integration of characteristics that leads to easier recognition of a pattern as a whole (binding), and the extraction of properties that need to be detailed and analyzed (attention). Attention seems to have a reciprocal relation with binding, inasmuch as the latter promotes the composition of features and their dependencies, while the former selects a single characteristic independently of the remainder. Classically, binding is viewed as a process whereby sets of properties are gathered into representative entities, which are themselves linked to form higher-level structures, in a sequence that culminates in the total integration of the pattern's features in a localized construct. The convergent axonal projections from one cortical area to another would be the neurobiological mechanism through which binding is achieved. Attention comprises the selective excitation of neuronal networks or pathways that stand for specific pattern properties; the thalamus and its reticular nucleus would then be the anatomical substrate of the attentional focus. In this paper we propose a computational model aimed at bringing together the main (and apparently diverging) ideas about binding and attention. Based on experimental data, a neuronal network representing cortical pyramidal cells is assembled, and its structure and function are related to the binding and attention phenomena. Specifically, the convergent projections that enlarge the visual receptive field are associated with binding, while a specific change in pyramidal-cell behavior is responsible for attention. Computer simulations are presented that reproduce the electrophysiology of pyramidal cells and mimic some interesting experimental results in visual attention. We conclude by conjecturing that attention is a driven interruption in the regular process of binding.
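To make the abstract's central idea concrete, the following is a minimal sketch, not the authors' actual model: a leaky-integrator "pyramidal" unit receives two converging feature pathways, standing in for binding through receptive-field enlargement, while a hypothetical attention gate suppresses one pathway, interrupting binding so the unit follows a single feature alone. Every name, parameter, and dynamical choice below is an assumption for illustration.

```python
# Illustrative sketch only (not the paper's implementation): convergent
# projections as binding, plus an attention gate that interrupts binding.
import numpy as np

rng = np.random.default_rng(0)

def simulate(attention_on, steps=200, dt=1.0, tau=20.0, threshold=1.0):
    """Leaky-integrator pyramidal unit driven by two converging pathways."""
    v = 0.0       # membrane-like state variable
    spikes = 0
    for _ in range(steps):
        # Two feature pathways with convergent projections (binding).
        feature_a = rng.random() < 0.3   # e.g., color-selective afferents
        feature_b = rng.random() < 0.3   # e.g., orientation-selective afferents
        # Hypothetical attention gate: suppress pathway B so the unit
        # responds to feature A alone instead of the bound conjunction.
        gate_b = 0.0 if attention_on else 1.0
        drive = 0.6 * feature_a + 0.6 * gate_b * feature_b
        v += dt * (-v / tau + drive)     # leaky integration of the drive
        if v >= threshold:               # simple threshold-and-reset spiking
            spikes += 1
            v = 0.0
    return spikes

print("binding (no attention):", simulate(attention_on=False))
print("attention engaged:     ", simulate(attention_on=True))
```

Gating one pathway, rather than amplifying the other, is the design choice that mirrors the abstract's closing conjecture: attention appears here as a driven interruption of binding rather than an added signal.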