A thermal emotion classifier for improved human-robot interaction

2016 
In their expanding roles as tutors and home and healthcare assistants, robots must effectively interact with individuals of varying ability and temperament. Indeed, deploying robots in long-term social engagements will almost certainly require them to reliably detect and adapt to changes in the demeanor of social partners to promote trust and more productive collaboration. However, recognizing emotional state typically relies on the interpretation of very subtle cues that often vary from one person to the next. In addition, while facial expressions, body posture, and features of speech have been used to detect affective changes, the robustness of these measures is often hindered by cultural and age differences. Recently, infrared thermography has shown promise in detecting guilt, fear, and stress, indicating that it may be a viable sensing modality for improved human-robot interaction. In this study, we evaluated the efficacy of using a far infrared (FIR) camera for detecting robot-elicited versus video-elicited affective responses by tracking thermal changes in five areas of the face. Further, we analyzed localized changes in the face to assess whether thermal and electrodermal responses to emotions elicited by traditional video techniques and by robots are similar. Finally, we performed principal component analysis to reduce the dimensionality of the data and evaluated machine learning techniques for classifying the thermal data by emotional state, resulting in a thermal classifier with a classification accuracy of 77.5%.
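The classification pipeline described in the abstract (PCA for dimensionality reduction followed by a supervised classifier evaluated on thermal features) can be sketched as below. This is a minimal illustration, not the authors' implementation: the feature layout, the number of retained components, and the choice of an RBF-kernel SVM are all assumptions, and the random arrays stand in for real thermal measurements from the five facial regions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature matrix: one row per trial, columns are thermal
# statistics (e.g., mean temperature change over time) from five facial
# regions. Shapes, class count, and labels are illustrative only; with
# random data the accuracy will be near chance.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 50))    # 120 trials x 50 thermal features
y = rng.integers(0, 3, size=120)  # 3 hypothetical emotion classes

# PCA reduces the feature dimensionality before classification, mirroring
# the pipeline in the abstract; the RBF SVM classifier is an assumption.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```

Cross-validation is used here because affective datasets of this size are typically too small for a fixed train/test split to give a stable accuracy estimate.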