    Vocal Synchrony of Robots Boosts Positive Affective Empathy
    Abstract:
    Robots that can talk with humans play increasingly important roles in society. However, current conversation robots remain unskilled at eliciting empathic feelings in humans. To address this problem, we used a robot that speaks in a voice synchronized with human vocal prosody. We conducted an experiment in which human participants held positive conversations with the robot by reading scenarios under conditions with and without vocal synchronization. We assessed seven subjective responses related to affective empathy (e.g., emotional connection) and measured physiological emotional responses using facial electromyography from the corrugator supercilii and zygomaticus major muscles as well as the skin conductance level. The subjective ratings consistently revealed heightened empathic responses to the robot in the synchronization condition compared with the de-synchronization condition. The physiological signals showed more positive and stronger emotional arousal responses to the robot under synchronization. These findings suggest that robots that can vocally synchronize with humans elicit empathic emotional responses.
    Keywords:
    Facial electromyography
    Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models' real-time performances of positive (smiling) and negative (frowning) dynamic facial expressions, or prerecorded videos of them, to participants. We measured subjective ratings of valence and arousal as well as facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. Subjective ratings showed that live facial expressions elicited higher valence and arousal than the corresponding videos in the positive emotion condition. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models' positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.
    Facial electromyography
    Facial muscles
    Emotional expression
    Emotional valence
    Citations (15)
    Human faces express emotions, informing others about their affective states. In order to measure expressions of emotion, facial electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces. However, its validity and comparability to EMG measures are unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning its ability to identify happy, angry, and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected above chance by both the software and EMG, while neutral expressions were more often falsely identified as negative by EMG than by the software. Overall, EMG and software values correlated highly. In conclusion, the Affectiva Affdex software can identify emotions, and its results are comparable to EMG findings.
    Facial electromyography
    Facial muscles
    Comparability
    Negative emotion
    Citations (7)
    An abundance of studies on emotional experiences in response to music has been published over the past decades; however, most have been carried out in controlled laboratory settings and rely on subjective reports. Facial expressions have occasionally been assessed, but measured using intrusive methods such as facial electromyography (fEMG). The present study investigated the emotional experiences of fifty participants at a live concert. Our aims were to explore whether automated face analysis could detect facial expressions of emotion in a group of people in an ecologically valid listening context, to determine whether emotions expressed by the music predicted specific facial expressions, and to examine whether facial expressions of emotion could be used to predict subjective ratings of pleasantness and activation. During the concert, participants were filmed and their facial expressions were subsequently analyzed with automated face analysis software. Self-reports of participants' subjective experience of pleasantness and activation were collected after the concert for all pieces (two happy, two sad). Our results show that the pieces that expressed sadness resulted in more facial expressions of sadness (compared to happiness), whereas the pieces that expressed happiness resulted in more facial expressions of happiness (compared to sadness). Differences for other facial expression categories (anger, fear, surprise, disgust, and neutral) were not found. Independent of the musical piece or the emotion expressed in the music, facial expressions of happiness predicted ratings of subjectively felt pleasantness, whilst facial expressions of sadness and disgust predicted low and high ratings of subjectively felt activation, respectively. Together, our results show that non-invasive measurements of audience facial expressions in a naturalistic concert setting are indicative of the emotions expressed by the music and of the subjective experiences of the audience members themselves.
    Sadness
    Disgust
    Facial electromyography
    Emotional expression
    Facial muscles
    Surprise
    Citations (18)
    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.
    Facial electromyography
    Emotion Perception
    Facial muscles
    Emotional expression
    Citations (136)
    Empathy is the ability to understand and to share another's emotional state, and is seen as an important social skill for maintaining relationships and inhibiting aggression. Yet, there has been little longitudinal research investigating the development of empathy, especially when it comes to adolescence, which is a formative period for empathy. Moreover, although empathy is generally seen as a multi-dimensional construct, involving affective, cognitive, and motor processes, and encompassing trait empathy (i.e., the general tendency to empathize with others) and state empathy (i.e., empathy as it occurs in specific situations), previous research has mainly focused on self-reported trait empathy. Therefore, little is known about how these different processes are interrelated. The aim of this dissertation was to address relations between the dimensions of empathy, to extend our understanding of empathy development in adolescence, to investigate the role of empathy in social behavior, and to examine why some adolescents are more empathic than others. The first study examined the relations between empathy dimensions and revealed that adolescents who showed stronger motor empathy in response to happiness or sadness (as measured with facial electromyography) consistently experienced higher affective state empathy and, indirectly, showed higher levels of cognitive state empathy. Remarkably, the study revealed no strong support for close links between adolescents' trait and state empathy. The second study investigated the development of cognitive and affective trait empathy across adolescence. Interestingly, girls not only showed higher levels of cognitive and affective empathy than boys did, but the developmental patterns also showed striking gender differences. In addition, pubertal status was associated with boys' affective empathy; boys who were physically more mature reported lower affective empathy compared with their physically less mature peers. The third study addressed whether adolescents high in affective empathy are more responsive to parental support than adolescents low in affective empathy. Interestingly, affective empathy indeed moderated the relation of parental support with both aggression and delinquency. Adolescents high in empathy benefited more from parental support than adolescents low in empathy. Remarkably, adolescents low in empathy not only benefited less from support, but even showed more aggression and delinquency when they experienced more parental support. The fourth study addressed the role of resting respiratory sinus arrhythmia (RSA) in adolescents' affective empathy and externalizing behavior. RSA appeared to moderate the association between parent-adolescent relationship quality and adolescents' social functioning. For instance, parental support was a positive predictor of affective empathy for girls high in RSA, whereas the association was non-significant for girls low in RSA. Further, for boys high in RSA, more negative interaction with parents predicted lower affective empathy, but remarkably, for boys low in RSA, more negative interaction predicted more affective empathy. To conclude, this dissertation addressed important issues regarding adolescents' empathy that have received little attention in previous research. The current research was the first to longitudinally investigate empathy development across adolescence and to thoroughly address associations between empathy dimensions. The results highlight adolescence as a period of change in empathic tendencies, but call for future research to disentangle the processes that may underlie these changes. Moreover, the results of this dissertation illustrate the usefulness of applying a multi-method approach in research on adolescents' empathy.
    Facial electromyography
    Trait
    Sadness
    Citations (0)
    Facial muscular reactions to avatars' static (neutral, happy, angry) and dynamic (morphs developing from neutral to happy or angry) facial expressions, presented for 1 s each, were investigated in 48 participants. Dynamic expressions led to better recognition rates and higher intensity and realism ratings. Angry expressions were rated as more intense than happy expressions. EMG recordings indicated emotion-specific reactions to happy avatars, as reflected in increased M. zygomaticus major and decreased M. corrugator supercilii tension, with stronger reactions to dynamic as compared to static expressions. Although rated as more intense, angry expressions elicited no significant M. corrugator supercilii activation. We conclude that facial reactions to angry and to happy facial expressions serve different functions in social interactions. Further research should vary dynamics in different ways and also include additional emotional expressions.
    Facial electromyography
    Facial muscles
    Avatar
    Emotional expression
    When viewing a face expressing emotion, the viewer's face mimics the same emotion. It is unknown whether such facial mimicry takes place when the viewed emotion is a task-irrelevant property of the face. The present experiment addressed this question by asking participants to judge either the emotional expression or the colour of a series of happy and angry faces that were either blue or yellow. Electromyographic recordings showed that when emotion was ignored, there was a tendency for facial muscle activity to be suppressed. Nonetheless, participants' facial expressions mimicked target expressions, with the zygomaticus cheek muscle being more active when viewing a smiling face and the corrugator brow muscle more active when viewing an angry face. These data support the automatic encoding of irrelevant emotional information, as well as the suppression of emotional information by selective attention.
    Facial electromyography
    Facial muscles
    Emotional expression
    Relevance
    Citations (38)
    Humans rapidly and spontaneously activate muscles in the face when viewing emotional facial expressions in others. These rapid facial reactions (RFRs) are thought to reflect low-level, bottom-up processes and are theorized to help an observer experience and share the affect of another individual. It has been assumed that RFRs are present from birth; however, to date, no study has investigated this response in children younger than 3 years of age. In the present study, we used facial electromyography (EMG) to measure corrugator supercilii (brow) and zygomaticus major (cheek) muscle activity in 7-month-old infants while they viewed happy and angry facial expressions. The results showed that 7-month-olds exhibited greater zygomaticus activity in response to happy expressions than to angry expressions; however, we found no evidence of differential corrugator muscle activity.
    Facial electromyography
    Facial muscles
    Affect
    Citations (26)
    Observing facial expressions automatically prompts imitation, as can be seen with facial electromyography. To investigate whether this reaction is driven by automatic mimicry or by recognition of the emotion displayed, we recorded electromyographic responses to presentations of facial expressions, face–voice combinations, and bodily expressions conveying happiness and fear. We observed emotion-specific facial muscle activity (zygomaticus for happiness, corrugator for fear) for all three stimulus categories. This indicates that spontaneous facial expression is more akin to an emotional reaction than to facial mimicry and imitation of the seen face stimulus. We suggest that seeing a facial expression, seeing an emotional body expression, or hearing an emotional tone of voice all activate the affect program corresponding to the emotion displayed.
    Facial electromyography
    Stimulus (psychology)
    Facial muscles
    Emotional expression
    Citations (121)