Affective Processing of Loved Familiar Faces: Contributions from Electromyography

2012 
Human faces are among the most significant stimuli in social and emotional communication. By looking at a face we can access a broad range of important information about another person, such as personal identity, emotional state (facial expression), sex, age, race, attractiveness, attitudes (whether they are friendly or hostile), intentions, and thoughts (Dekowska et al., 2008; Adolphs, 2009). Because of its relevance in everyday interactions, the face has been the target of much study over the past decades.

Most studies on the psychology of face perception and recognition have focused on emotional facial expressions, following the pioneering work of Tomkins (1962), Izard (1971, 1977, 1994), and Ekman (1984, 1992), which built on the evolutionary perspective outlined by Darwin in The Expression of the Emotions in Man and Animals (1872) (see Russell, 1994; Dimberg, 1997; Whalen et al., 1998; Öhman et al., 2001; Adolphs, 2002; Eimer & Holmes, 2007; Vuilleumier & Pourtois, 2007; Li et al., 2010). More recently, a large body of research has been devoted to delineating the brain mechanisms involved in face perception and identity recognition (see Adolphs, 2002; Fairhall & Ishai, 2006; Dekowska et al., 2008; Li et al., 2010).

In this context, central electrophysiological techniques, such as electroencephalography (EEG), event-related potentials (ERPs), and magnetoencephalography (MEG), and metabolic techniques, such as positron emission tomography (PET) and functional magnetic resonance imaging (fMRI), have proven particularly useful in describing, respectively, the chain of events taking place during the processing of facial information and the neural circuitry responsible for face recognition. Thus, several event-related components, such as P1, N170, N250r, P300, and N400, have been used to determine the temporal sequence that goes from pictorial/structural encoding to the retrieval of biographical/emotional information (Bentin et al., 1996; Bruce & Young, 1986; Eimer, 2000a; Herrmann et al., 2005; Schweinberger, 2011). Imaging techniques, for their part, have helped identify the specific brain areas involved in face perception and recognition, including recognition of emotional expression and personal identity (Adolphs, 2002; Gobbini & Haxby, 2007; Zeki, 2007; Haxby & Gobbini, 2011). These areas constitute a distributed network with a core system responsible for the analysis of visual appearance (posterior superior temporal sulcus, inferior occipital and fusiform gyri) and an extended system that underlies the retrieval of person knowledge (medial prefrontal cortex, temporo-parietal junction, anterior temporal cortex, precuneus, and posterior cingulate), action understanding (inferior temporal and frontal operculum, and intraparietal sulcus), and emotion processing (amygdala, insula, and striatum/reward system).