Induction and Evaluation of Affects for Facial Motion Capture

2007 
In this study, we are interested in capturing the facial configurations of Affects in order to use them for Embodied Conversational Agents (ECAs). To create a believable ECA, it is necessary to capture natural Affects that can be learnt and replayed. Until now, however, animation data have been extracted from videos, and their description is far from sufficient to generate realistic facial expressions. It seems that believable results cannot be obtained without 3D motion capture. This is why, in this study, we set up a protocol for Affect induction in a motion capture situation, with manipulated subjects who are unaware of the real goals of the experiment. Similarly to [1], we induce natural Affects in order to capture the related facial expressions.