Face perception in humans and nonhuman primates is accomplished by a patchwork of specialized cortical regions. How these regions develop has remained controversial. In sighted individuals, facial information is conveyed primarily via the visual modality; early blind individuals, on the other hand, can recognize shapes using auditory and tactile cues. Here we demonstrate that such individuals can learn to distinguish faces from houses and other shapes by using a sensory substitution device (SSD) that presents schematic faces as sound-encoded stimuli. Using functional MRI, we then asked whether a face-selective brain region such as the fusiform face area (FFA) shows selectivity for faces in the same subjects, and indeed we found evidence for preferential activation of the left FFA by sound-encoded faces. These results imply that FFA development does not depend on experience with visual faces per se but may instead depend on exposure to the geometry of facial configurations.
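The abstract does not specify the SSD's encoding scheme, but a common image-to-sound mapping in such devices (for example, the vOICe) scans an image column by column over time, maps vertical position to pitch, and maps pixel brightness to loudness. The sketch below is a hypothetical illustration of that general principle, not the encoding used in this study; all function and variable names are our own.

```python
import numpy as np

def image_to_sound_events(image, scan_time=1.0, f_min=200.0, f_max=2000.0):
    """Map a 2D grayscale image (rows x cols, values in [0, 1]) to audio
    events: columns are scanned left to right over `scan_time` seconds,
    vertical position maps to pitch (top = high), brightness to loudness."""
    n_rows, n_cols = image.shape
    # One onset time per column, spread evenly across the scan.
    times = np.linspace(0.0, scan_time, n_cols, endpoint=False)
    # One frequency per row, log-spaced from high (top) to low (bottom).
    freqs = np.geomspace(f_max, f_min, n_rows)
    events = []
    for col in range(n_cols):
        for row in range(n_rows):
            amp = float(image[row, col])
            if amp > 0.0:  # silent pixels produce no event
                events.append((float(times[col]), float(freqs[row]), amp))
    return events

# A schematic "face" on a 5x5 grid: two eyes and a mouth.
face = np.zeros((5, 5))
face[1, 1] = face[1, 3] = 1.0   # eyes
face[3, 1:4] = 1.0              # mouth
events = image_to_sound_events(face)
print(len(events))  # 5 bright pixels -> 5 sound events
```

Under this mapping, the two eyes produce brief high-pitched tones early and late in the scan, while the mouth produces a sustained lower-pitched band, so different facial configurations yield distinct soundscapes.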
Sighted individuals are slower and less accurate at localizing sounds coming from peripheral space than sounds coming from frontal space. This bias in favour of the frontal auditory space seems reduced in early blind individuals, who are notably better than sighted individuals at localizing sounds coming from peripheral space. Currently, it is not clear whether this bias is a general phenomenon of auditory processing or whether it is specific to spatial processing (i.e. sound localization). Here we compared the performance of early blind participants with that of sighted subjects during a frequency discrimination task with sounds originating from either frontal or peripheral locations. Results showed that early blind participants discriminated both peripheral and frontal sounds faster than sighted subjects. In addition, sighted subjects were faster at discriminating frontal sounds than peripheral ones, whereas early blind participants discriminated frontal and peripheral sounds equally fast. We conclude that the spatial bias observed in sighted subjects reflects an imbalance in the spatial distribution of auditory attention resources that is induced by visual experience.
The segregation of cortical pathways for the identification and localization of objects is thought to be a general organizational principle of the brain. Yet little is known about the unimodal versus multimodal nature of these processing streams. The main purpose of the present study was to test whether the auditory and tactile dual pathways converge into specialized multisensory brain areas. We used functional magnetic resonance imaging (fMRI) to compare directly, in the same subjects, the brain activation related to localization and identification of comparable auditory and vibrotactile stimuli. Results indicate that the right inferior frontal gyrus (IFG) and the left and right insula were more activated during identification than during localization in both touch and audition. The reverse dissociation was found for the left and right inferior parietal lobules (IPL), the left superior parietal lobule (SPL) and the right precuneus-SPL, all of which were more activated during localization in the two modalities. We propose that specialized areas in the right IFG and the left and right insula are multisensory operators for the processing of stimulus identity, whereas parts of the left and right IPL and SPL are specialized for the processing of spatial attributes independently of sensory modality.
Having recently shown in behavioural tasks that top-down cognitive control is specifically altered in tinnitus sufferers, here we sought to establish the link between this impaired executive function and alterations of the frontal cortex in tinnitus patients. Using functional magnetic resonance imaging (fMRI), we monitored brain activity changes in sixteen tinnitus patients (TP) and control subjects (CS) while they performed a spatial Stroop task in both audition and vision. We observed that TP differed from CS in their functional recruitment of the dorsolateral prefrontal cortex (dlPFC, BA46), the cingulate gyrus and the ventromedial prefrontal cortex (vmPFC, BA10). This recruitment was higher during interference conditions in tinnitus participants than in controls, whatever the sensory modality. Furthermore, activity levels in the right dlPFC and vmPFC correlated with Stroop task performance in TP. Given the direct link between poor executive function and prefrontal cortex alterations in TP, we postulate that a lack of inhibitory modulation following impaired top-down cognitive control may maintain tinnitus by hampering habituation mechanisms. This deficit in executive function caused by prefrontal cortex alterations would be a key factor in the generation and persistence of tinnitus.
Tinnitus is the perception of sound in the absence of an external stimulus. Currently, the pathophysiology of tinnitus is not fully understood, but recent studies indicate that brain alterations involve non-auditory areas, including the prefrontal cortex. In experiment 1, we used a go/no-go paradigm to evaluate target detection speed and inhibitory control in tinnitus participants (TP) and control subjects (CS), in both unimodal and bimodal conditions in the auditory and visual modalities. We also tested whether the sound frequency used for targets and distractors affected performance. We observed that TP were slower and made more false alarms than CS in all unimodal auditory conditions. TP were also slower than CS in the bimodal conditions. In addition, when comparing response times in bimodal and auditory unimodal conditions, the expected gain in bimodal conditions was present in CS, but not in TP when tinnitus-matched frequency sounds were used as targets. In experiment 2, we tested the sensitivity to cross-modal interference in TP during auditory and visual go/no-go tasks in which each stimulus was preceded by an irrelevant pre-stimulus in the untested modality (e.g. a high-frequency auditory pre-stimulus in the visual go/no-go condition). We observed that TP had longer response times than CS and made more false alarms in all conditions. In addition, the highest false alarm rate occurred in TP when tinnitus-matched/high-frequency sounds were used as pre-stimuli. We conclude that inhibitory control is altered in TP and that TP are abnormally sensitive to cross-modal interference, reflecting difficulty in ignoring irrelevant stimuli. The fact that the strongest interference effect was caused by tinnitus-like auditory stimulation is consistent with the hypothesis that such stimulation generates emotional responses that affect cognitive processing in TP.
We postulate that executive function deficits play a key role in the perception and maintenance of tinnitus.
Previous neuroimaging studies have identified multimodal brain areas in the visual cortex that are specialized in the processing of specific information, such as visual-haptic recognition of objects. Here we test whether visual brain areas are involved in depth perception when auditory substitution of vision is used. Seven early blind subjects (EB) and nine blindfolded sighted volunteers (BS) were trained to use a prosthesis substituting vision with audition (PSVA) to recognize two-dimensional figures. They were also taught some pictorial monocular depth cues during an object-distance estimation task with the prosthesis in a real three-dimensional environment. Using positron emission tomography, regional cerebral blood flow was assessed during exploration of virtual 3D images with the prosthesis while focusing either on 2D features (target search) or on depth (target-distance comparison). In sighted subjects, results showed activation of visual association areas for both the target search task, involving the occipito-parietal cortex, and the depth perception condition, which activated occipito-parietal and occipito-temporal areas. This indicates that some areas of the visual cortex are relatively multimodal and may be recruited for depth processing via a sense other than vision. By contrast, in EB subjects the activation patterns during target search and 3D perception were quite similar and restricted to the dorsal visual stream. The absence of any specific brain activation for depth perception in the EB subjects underscores the crucial role of prior visual experience in achieving visual-like depth perception with the PSVA.