Concurrent perception and action: Minimal interference between visual identification and pointing
Keywords: Identification
Actions taking place in the environment are critical for our survival. We review evidence on attention to action, drawing on converging evidence from neuropsychological patients through to studies of the time course and neural locus of action-based cueing of attention in normal observers. We show that the presence of action relations between stimuli helps reduce visual extinction in patients with limited attention to the contralesional side of space, while the first saccades made by normal observers, and early perceptual and attentional responses measured using electroencephalography/event-related potentials, are modulated by preparation of action and by seeing objects being grasped correctly or incorrectly for action. In both normal observers and patients, there is evidence for two components to these effects, one based on visual perceptual responses and one on motor-based responses. While the perceptual component reflects factors such as the visual familiarity of the action-related information, the motor component is determined by factors such as the alignment of the objects with the observer's effectors, not by the visual familiarity of the stimuli. In addition, we suggest that action relations between stimuli can be coded pre-attentively, in the absence of attention to the stimuli, and that action relations cue perceptual and motor responses rapidly and automatically. At present, formal theories of visual attention are not set up to account for these action-related effects; we suggest ways in which such theories could be extended to incorporate them.
Audio-visual integration interacts with attentional mechanisms: salient auditory stimuli automatically draw attention to an audio-visual event, while spatial attention can modulate audio-visual integration. Attention induced by auditory inputs (sound-driven attention) facilitates visual perception, and visual attention likewise improves performance on visual tasks. However, the difference between attention driven by auditory cues and by visual cues is unclear. When visual attention facilitates visual perception, there is a trade-off between spatial and temporal resolution; audition, in contrast, has superior temporal resolution to vision. In the present study, we investigated the difference between auditory and visual cue-driven attention with respect to this trade-off. The results indicated that visual cueing increased spatial resolution but decreased temporal resolution, whereas auditory cueing affected the efficiency of visual processing (i.e., response time) for temporal gap detection. These findings suggest that auditory cueing capitalizes on resources available for visual processing, whereas visual cueing may increase activation of the spatial channel rather than inhibiting the temporal channel, as proposed in a previous study. Overall, there appear to be clear differences between the mechanisms involved in auditory and visual cue-driven attention.
Failure to detect changes to salient visual input across a brief interval has popularized the use of change detection, a paradigm that plays an important role in recent studies of visual perception, short-term memory, and consciousness. Much research has focused on the nature of the visual representations of the pre- and postchange displays, yet little is known about how events inserted between those displays interfere with visual change detection. To address this question, we tested change detection of colors, spatial locations, and natural scenes when the interval between changes was (1) blank, (2) filled with a visual scene, or (3) filled with an auditory word. Participants were asked either to ignore the inserted visual or auditory event or to attend to it by categorizing it as animate or inanimate. Results showed that the ability to detect visual changes was dramatically impaired by attending to a secondary task during the delay. This interference was significant for auditory as well as visual interfering events and was invariant to the complexity of the prechange displays. Passive listening produced no interference, whereas passive viewing produced small but significant interference. We conclude that visual change detection relies significantly on central, amodal attention.
In sensorimotor integration, the brain needs to decide how its predictions should accommodate novel evidence by 'gating' sensory data depending on the current context. Here, we examined the oscillatory correlates of this process by recording magnetoencephalography (MEG) data during a new task requiring action under intersensory conflict. We used virtual reality to decouple visual (virtual) and proprioceptive (real) hand postures during a task in which the phase of grasping movements tracked a target (in either modality). Thus, we rendered visual information either task-relevant or a (to-be-ignored) distractor. Under visuo-proprioceptive incongruence, occipital beta power decreased (relative to congruence) when vision was task-relevant but increased when it had to be ignored. Dynamic causal modeling (DCM) revealed that this interaction was best explained by diametrical, task-dependent changes in visual gain. These results suggest a crucial role for beta oscillations in the contextual gating (i.e., gain or precision control) of visual vs proprioceptive action feedback, depending on current behavioral demands.