Spatio-temporal dynamics of face perception

2019 
The temporal and spatial neural processing of faces has been studied rigorously, but few studies have unified these dimensions to reveal the spatio-temporal dynamics postulated by models of face processing. We used support vector machine decoding and representational similarity analysis to combine information from different locations (fMRI), timepoints (EEG), and theoretical models. By correlating information matrices derived from pairwise decoding of neural responses to different facial expressions, we found early EEG timepoints (110-150 ms) to match fMRI data from early visual cortex (EVC), and later timepoints (170-250 ms) to match data from the occipital and fusiform face areas (OFA/FFA) and the posterior superior temporal sulcus (pSTS). The earliest correlations were driven by information from happy faces, and the later ones by more accurate decoding of fearful and angry faces. Model comparisons revealed systematic changes along the processing hierarchy, from emotional-distance and visual-feature coding in EVC to coding of expression intensity in right pSTS. The results highlight the importance of a multimodal approach for understanding the functional roles of different brain regions.
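The fusion approach described above can be sketched as follows: pairwise decoding accuracies between conditions are arranged into representational dissimilarity matrices (RDMs), and the EEG RDM at each timepoint is rank-correlated with the fMRI RDM from each region. This is an illustrative sketch only; the matrix sizes, variable names, and random data are hypothetical, not taken from the study.

```python
# Illustrative sketch of EEG-fMRI fusion via representational similarity
# analysis (RSA). In the actual study, RDM entries would be pairwise SVM
# decoding accuracies between facial expressions; here they are random.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_cond = 6  # hypothetical number of facial-expression conditions
n_pairs = n_cond * (n_cond - 1) // 2  # number of condition pairs

def upper_triangle(rdm):
    """Vectorise the upper triangle of a condition-by-condition RDM."""
    iu = np.triu_indices(rdm.shape[0], k=1)
    return rdm[iu]

# Hypothetical symmetric RDMs: one EEG timepoint, one fMRI region.
eeg_rdm = rng.random((n_cond, n_cond))
eeg_rdm = (eeg_rdm + eeg_rdm.T) / 2
fmri_rdm = rng.random((n_cond, n_cond))
fmri_rdm = (fmri_rdm + fmri_rdm.T) / 2

# Fusion: rank-correlate the two dissimilarity structures. Repeating this
# for every EEG timepoint and every fMRI region yields a spatio-temporal
# map of where and when representational structure matches.
rho, p = spearmanr(upper_triangle(eeg_rdm), upper_triangle(fmri_rdm))
print(f"EEG-fMRI RDM correlation: rho = {rho:.3f}")
```

In practice this correlation is computed across all timepoints and regions, and significance is assessed with permutation tests; the sketch shows only the core RDM-correlation step.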