Visual motion processing plays a key role in enabling primates' successful interaction with their dynamic environments. Although the speed of visual stimuli in natural environments varies continuously, speed tuning of neurons in the prototypical motion area MT has traditionally been assessed with stimuli moving at constant speeds. We investigated whether the representation of speed in a continuously varying stimulus context differs from the representation of constant speeds. We recorded from individual MT neurons of fixating macaques while stimuli moved either at a constant speed or in a linearly accelerating or decelerating manner. We found clear speed tuning even when the stimulus consisted of visual motion with gradual speed changes. There were, however, important differences from the speed tuning measured with constant-speed stimuli: the stimulus context affected both the neurons' preferred speeds and the widths of their speed tuning curves. These acceleration-dependent response changes led to an accurate representation of stimulus acceleration in MT cells. To elucidate the mechanistic basis of this signal, we constructed a stochastic firing rate model based on the constant-speed response profiles. This model incorporated each cell's speed tuning and response adaptation dynamics and accurately predicted the responses to constant speeds as well as to accelerating and decelerating stimuli. Because the responses of the model neurons had no explicit acceleration dependence, we conclude that speed-dependent adaptation creates a strong influence of temporal context on the MT response and thereby gives rise to the representation of acceleration signals.
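The abstract describes the model only at a high level. The sketch below implements a minimal model of the same general type: a log-Gaussian speed tuning curve whose output is divisively reduced by an adaptation variable that tracks the recent firing rate, with Poisson spiking on top. All functional forms and parameter values are illustrative assumptions, not the fitted quantities from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not the paper's fitted values).
PREF_SPEED = 8.0     # preferred speed, deg/s
TUNING_WIDTH = 1.2   # tuning width on a log-speed axis
R_MAX = 60.0         # peak firing rate, spikes/s
TAU_ADAPT = 0.5      # adaptation time constant, s
ADAPT_GAIN = 0.02    # strength of divisive adaptation
DT = 0.01            # simulation time step, s

def speed_tuning(speed):
    """Log-Gaussian speed tuning, a common description of MT cells."""
    z = (np.log(speed) - np.log(PREF_SPEED)) / TUNING_WIDTH
    return R_MAX * np.exp(-0.5 * z ** 2)

def simulate(speeds):
    """Rate model with divisive adaptation plus Poisson spiking.

    The rate depends only on the current speed and the adaptation
    state; there is no explicit acceleration term anywhere.
    """
    a = 0.0                      # adaptation state
    rates, spikes = [], []
    for s in speeds:
        drive = speed_tuning(s)
        rate = drive / (1.0 + ADAPT_GAIN * a)   # divisive gain control
        a += DT * (rate - a) / TAU_ADAPT        # adaptation tracks recent rate
        rates.append(rate)
        spikes.append(rng.poisson(rate * DT))   # stochastic spike count
    return np.array(rates), np.array(spikes)

t = np.arange(0.0, 2.0, DT)
accelerating = np.linspace(1.0, 16.0, t.size)   # speed ramps up
decelerating = accelerating[::-1]               # same speeds, reversed history

r_acc, _ = simulate(accelerating)
r_dec, _ = simulate(decelerating)

# At matched speeds the two conditions evoke different rates because the
# adaptation state differs -- a purely history-dependent effect.
print("rate at 8 deg/s, accelerating:", r_acc[np.argmin(np.abs(accelerating - 8))])
print("rate at 8 deg/s, decelerating:", r_dec[np.argmin(np.abs(decelerating - 8))])
```

Although the model never sees acceleration explicitly, its responses at matched speeds differ between the two conditions, which is the history dependence the abstract attributes to speed-dependent adaptation.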
The patterns of optic flow seen during self-motion can be used to determine the direction of one's own heading. Tracking eye movements, which typically occur during everyday life, complicate this task because they add further retinal image motion and (predictably) distort the retinal flow pattern. Humans employ both visual and nonvisual (extraretinal) information to solve a heading task in such cases. Likewise, it has been shown that neurons in the monkey medial superior temporal area (area MST) use both signals during the processing of self-motion information. In this article we report that neurons in the macaque ventral intraparietal area (area VIP) use visual information derived from the distorted flow patterns to encode heading during (simulated) eye movements. We recorded responses of VIP neurons to simple radial flow fields and to distorted flow fields that simulated self-motion plus eye movements. In 59% of the cases, cell responses compensated for the distortion and kept the same heading selectivity irrespective of the different simulated eye movements. In addition, response modulations were smaller during real than during simulated eye movements, consistent with reafferent signaling being involved in the processing of the visual consequences of eye movements in area VIP. We conclude that the motion selectivities found in area VIP, like those in area MST, provide a way to successfully analyze and use flow fields during self-motion with simultaneous tracking movements.
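To make the two stimulus classes concrete, the sketch below constructs a radial flow field (simulated forward self-motion) and the same field distorted by a horizontal pursuit eye movement, using the standard first-order approximation that pursuit adds a roughly uniform flow component. The grid, heading, and rate parameters are invented for illustration and are not the stimulus parameters of the study.

```python
import numpy as np

# Dot positions on the screen, deg of visual angle (illustrative grid).
x, y = np.meshgrid(np.linspace(-20, 20, 9), np.linspace(-20, 20, 9))

HEADING = np.array([5.0, 0.0])   # true heading (focus of expansion), deg
EXPANSION = 0.1                  # expansion rate ~ translation speed / depth
PURSUIT = 2.0                    # horizontal pursuit speed, deg/s

# Pure radial flow for forward self-motion: velocity points away from
# the focus of expansion and grows with eccentricity.
u_radial = EXPANSION * (x - HEADING[0])
v_radial = EXPANSION * (y - HEADING[1])

# A horizontal pursuit eye movement adds (to first order, for a small
# frontoparallel field) a roughly uniform opposite flow component.
u_distorted = u_radial - PURSUIT
v_distorted = v_radial

# The singularity of the distorted field no longer sits at the heading:
# solving u = 0 shows it is displaced by PURSUIT / EXPANSION degrees.
print("true heading x:", HEADING[0])
print("flow singularity x:", HEADING[0] + PURSUIT / EXPANSION)

# Numerically: the grid point with the slowest distorted flow.
speed = np.hypot(u_distorted, v_distorted)
idx = np.unravel_index(np.argmin(speed), speed.shape)
print("grid point nearest singularity:", x[idx], y[idx])
```

The displaced singularity is the distortion that VIP responses are reported to compensate for: a heading read directly off the flow singularity would be wrong by PURSUIT / EXPANSION degrees in this toy formulation.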
We have recently shown that stimulus acceleration affects the subsequent preferred speed and tuning widths of macaque area MT neurons (A. Schlack, B. Krekelberg, & T. D. Albright, 2007). Given the close link between area MT and speed perception, this predicts that speed perception should depend on the acceleration context. Here, we show that this is indeed the case for both speed discrimination and the perception of absolute speed. Specifically, speed discrimination thresholds improve in an acceleration context, but absolute speeds are underestimated more than in a deceleration context. In line with our physiological data, these changes can be understood in terms of speed-dependent adaptation mechanisms in MT and do not require an explicit acceleration dependence of speed perception.
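As a rough illustration of the proposed explanation, the sketch below decodes speed from a labeled-line population whose gains have been adapted by the recent speed history, under an assumed fatigue-style adaptation rule. It shows that the acceleration context alone biases the readout without any explicit acceleration signal; the sign and size of the bias depend entirely on the assumed adaptation rule, and the full account in the paper also involves the tuning-width changes observed physiologically.

```python
import numpy as np

# Labeled-line population of MT-like units with log-Gaussian speed tuning.
prefs = np.geomspace(0.5, 64.0, 25)      # preferred speeds, deg/s (assumed)
WIDTH = 1.0                              # log-speed tuning width (assumed)

def responses(speed, gain):
    z = (np.log(speed) - np.log(prefs)) / WIDTH
    return gain * np.exp(-0.5 * z ** 2)

def decode(r):
    """Read out speed as the response-weighted preferred speed (log axis)."""
    return np.exp(np.sum(r * np.log(prefs)) / np.sum(r))

def adapted_gain(history, strength=0.5):
    """Fatigue-style adaptation: gain drops with recent drive (assumed rule)."""
    drive = np.mean([responses(s, 1.0) for s in history], axis=0)
    return 1.0 / (1.0 + strength * drive)

TEST_SPEED = 8.0
acc_history = np.linspace(2.0, 8.0, 20)   # acceleration: slower recent speeds
dec_history = np.linspace(14.0, 8.0, 20)  # deceleration: faster recent speeds

for label, hist in [("acceleration", acc_history), ("deceleration", dec_history)]:
    r = responses(TEST_SPEED, adapted_gain(hist))
    print(f"{label} context: decoded speed = {decode(r):.2f} deg/s")
```

Identical test speeds yield different decoded values in the two contexts purely because the gains carry the stimulus history, which is the sense in which no explicit acceleration dependence of perception is needed.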
Navigation in space requires the brain to combine information arising from different sensory modalities with the appropriate motor commands. Sensory information about self-motion in particular is provided by the visual and the vestibular system. The macaque ventral intraparietal area (VIP) has recently been shown to be involved in the processing of self-motion information provided by optical flow, to contain multimodal neurons and to receive input from areas involved in the analysis of vestibular information. By studying responses to linear vestibular, visual and bimodal stimulation we aimed at gaining more insight into the mechanisms involved in multimodal integration and self-motion processing. A large proportion of cells (77%) revealed a significant response to passive linear translation of the monkey. Of these cells, 59% encoded information about the direction of self-motion. The phase relationship between vestibular stimulation and neuronal responses covered a broad spectrum, demonstrating the complexity of the spatio-temporal pattern of vestibular information encoded by neurons in area VIP. For 53% of the direction-selective neurons the preferred directions for stimuli of both modalities were the same; they were opposite for the remaining 47% of the neurons. During bimodal stimulation the responses of neurons with opposite direction selectivity in the two modalities were determined either by the visual (53%) or the vestibular (47%) modality. These heterogeneous responses to unimodal and bimodal stimulation might be used to prevent misjudgements about self- and/or object-motion, which could be caused by relying on information of one sensory modality alone.
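One way to picture the bimodal classification reported above is as a comparison of the bimodal tuning curve against each unimodal curve. The sketch below uses a simple correlation rule for this purpose; the rule, the cosine tuning, and all numbers are invented for illustration and are not the analysis actually used in the study.

```python
import numpy as np

# Direction axis for linear self-motion stimuli (deg), illustrative sampling.
directions = np.arange(0, 360, 45)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

def dominant_modality(visual, vestibular, bimodal):
    """Classify a bimodal tuning curve by which unimodal curve it matches."""
    return ("visual" if corr(bimodal, visual) > corr(bimodal, vestibular)
            else "vestibular")

# Toy neuron with opposite direction preferences in the two modalities
# (all numbers invented for illustration).
visual = 10 + 8 * np.cos(np.deg2rad(directions - 90))       # prefers 90 deg
vestibular = 10 + 8 * np.cos(np.deg2rad(directions - 270))  # prefers 270 deg
bimodal = 0.8 * visual + 0.2 * vestibular                   # visually dominated

print(dominant_modality(visual, vestibular, bimodal))       # -> "visual"
```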
Navigation through the environment requires the brain to process a number of incoming sensory signals, such as visual optical flow on the retina and motion information originating from the vestibular organs. In addition, tactile as well as auditory signals can help to disambiguate the continuous stream of incoming information and to determine which signals result from one's own self-motion. In this review I will focus on the cortical processing of motion information in one subregion of the posterior parietal cortex, i.e., the ventral intraparietal area (VIP). I will review (1) electrophysiological data from single-cell recordings in the awake macaque showing how self-motion signals across different sensory modalities are represented within this area and (2) data from fMRI recordings in normal human subjects providing evidence for the existence of a functional equivalent of macaque area VIP in the human cortex.
Animals can use different sensory signals to localize objects in the environment. Depending on the situation, the brain either integrates information from multiple sensory sources or it chooses the modality conveying the most reliable information to direct behavior. This suggests that somehow, the brain has access to a modality-invariant representation of external space. Accordingly, neural structures encoding signals from more than one sensory modality are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP), known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to the acoustic stimuli greatly depended on the spatial location of the stimuli; i.e., most of the auditory-responsive neurons had surprisingly small, spatially restricted auditory receptive fields (RFs). Given this finding, we compared the auditory RF locations with the respective visual RF locations of individual area VIP neurons. In the vast majority of neurons, the auditory and visual RFs largely overlapped. Additionally, neurons with well-aligned visual and auditory receptive fields tended to encode multisensory space in a common reference frame. This suggests that area VIP constitutes a part of a neuronal circuit involved in the computation of a modality-invariant representation of external space.
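The notion of a common reference frame can be operationalized by measuring how far each modality's RF shifts when the eyes move: a shift matching the gaze displacement indicates eye-centered coding, no shift indicates head-centered coding, and matching shift indices across modalities indicate a common frame. The sketch below illustrates this logic with invented measurements; it is not the analysis pipeline of the study.

```python
def rf_shift(center_at_gaze_left, center_at_gaze_right, gaze_offset):
    """Fraction of the gaze displacement by which the RF center moved.

    ~1 means the RF moved with the eyes (eye-centered coding),
    ~0 means it stayed fixed on the head (head-centered coding).
    """
    return (center_at_gaze_right - center_at_gaze_left) / gaze_offset

GAZE_OFFSET = 20.0  # deg between the two fixation positions (illustrative)

# Invented example measurements for one neuron, in head-centered deg.
visual_shift = rf_shift(-5.0, 14.0, GAZE_OFFSET)    # ~0.95 -> eye-centered
auditory_shift = rf_shift(-4.0, 15.0, GAZE_OFFSET)  # ~0.95 -> eye-centered

print(f"visual RF shift index:   {visual_shift:.2f}")
print(f"auditory RF shift index: {auditory_shift:.2f}")
print("common reference frame" if abs(visual_shift - auditory_shift) < 0.2
      else "different reference frames")
```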
In natural environments the speed of moving objects continually changes. To interact successfully with such objects it is useful not only to track the ongoing speed, but also to take speed changes into account. In the present study we asked whether area MT, the main motion area in the primate brain, represents not only the ongoing speed of a visual motion stimulus but also its recent speed history. We recorded from MT neurons of macaque monkeys during visual stimulation. The stimulus consisted of random dots moving in the preferred direction of the neuron. Stimulus speed changed smoothly over time, either linearly accelerating (condition 1) or linearly decelerating (condition 2). Both conditions contained the same actual speeds, but the speed history differed. We found that the responses of most MT cells were influenced by the recent speed history of the visual stimulus. One main finding was a change in tuning width: the speed tuning was narrower when the stimulus was accelerating than when it was decelerating. This suggests that the system is less sensitive to speed changes when the stimulus smoothly decelerates. We tested this prediction in a psychophysical experiment with human subjects by determining detection thresholds for speed changes in a smoothly accelerating or decelerating stimulus. The results confirmed the prediction: during smooth deceleration, sensitivity was lower than during acceleration. In summary, we found that recent stimulus speed influences the speed tuning properties of MT neurons and the speed perception of human subjects. This is further evidence that the visual system does not represent snapshots of the ongoing visual stimulation but integrates information over time. This integration is beneficial for survival in an ever-changing environment.
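The link between tuning width and sensitivity to speed changes can be made concrete with a standard Fisher-information argument: for independent Poisson neurons, the information about small speed changes is J(s) = sum_i f_i'(s)^2 / f_i(s), and the discrimination threshold scales as 1 / sqrt(J). The toy population below, with assumed parameters, shows that narrower tuning yields more information and hence lower thresholds, matching the direction of the psychophysical result; it is not a model of the actual experiment.

```python
import numpy as np

prefs = np.geomspace(0.5, 64.0, 25)   # preferred speeds (assumed population)
R_MAX = 50.0                          # peak rate, spikes/s (assumed)

def fisher_info(speed, width):
    """Fisher information about log speed in a Poisson population.

    For independent Poisson neurons, J(s) = sum f_i'(s)^2 / f_i(s).
    """
    z = (np.log(speed) - np.log(prefs)) / width
    f = R_MAX * np.exp(-0.5 * z ** 2)          # log-Gaussian tuning curves
    df = f * (-z / width)                      # derivative w.r.t. log speed
    return np.sum(df ** 2 / f)

for width, label in [(0.8, "narrow (acceleration-like)"),
                     (1.4, "broad (deceleration-like)")]:
    j = fisher_info(8.0, width)
    # Discrimination threshold scales as 1 / sqrt(Fisher information).
    print(f"{label}: threshold proportional to {1 / np.sqrt(j):.4f}")
```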