Emotion Expression of Avatar through Eye Behaviors, Lip Synchronization and MPEG4 in Virtual Reality based on Xface Toolkit: Present and Future

2013 
Abstract Eye movement combined with lip synchronization and emotional facial expression has revealed an interesting research field that gives information about the verbal and nonverbal behaviors occurring in the human body. Most previous researchers have focused on eye gaze, lip synchronization, and emotion expression, which are the most important features for transferring nonverbal information to enhance, understand, or express emotion. In this paper, recent advances in 3D facial expression are introduced, focusing on the Xface platform toolkit, which supports the synthesis of 3D talking avatars by implementing a text-to-speech (TTS) engine to depict the basic lip shapes necessary for each phoneme of the dialogue. This work is believed to give a future direction that can lead to new research issues in facial animation.
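To make the TTS-driven lip-synchronization idea concrete, the sketch below illustrates the general phoneme-to-viseme mapping technique the abstract describes: each phoneme reported by a TTS engine is mapped to one of the MPEG-4 visemes (basic lip shapes), producing a timed keyframe sequence a face renderer can interpolate. This is a minimal illustration, not Xface's actual API; the `PHONEME_TO_VISEME` table values and the TTS timing format are assumptions chosen for the example.

```python
# Minimal sketch of phoneme-to-viseme mapping for TTS-driven lip sync.
# NOTE: this is an illustrative assumption, not the real Xface toolkit API;
# the viseme indices below follow the spirit of the MPEG-4 viseme table
# but are not an authoritative copy of it.

from dataclasses import dataclass

# Hypothetical subset of an MPEG-4-style viseme table (0 = neutral/silence).
PHONEME_TO_VISEME = {
    "sil": 0,                               # silence -> neutral mouth
    "p": 1, "b": 1, "m": 1,                 # bilabials share a closed-lip shape
    "f": 2, "v": 2,                         # labiodentals
    "t": 3, "d": 3,                         # alveolars
    "aa": 10, "ae": 11, "iy": 12, "uw": 13, # example vowel shapes
}

@dataclass
class VisemeKeyframe:
    viseme_id: int    # viseme index the renderer blends toward
    start_ms: int     # when the lip shape begins
    duration_ms: int  # how long it is held before the next shape

def phonemes_to_visemes(timed_phonemes):
    """Convert (phoneme, start_ms, duration_ms) triples, as a TTS engine
    might report them, into viseme keyframes; unknown phonemes fall back
    to the neutral viseme."""
    return [
        VisemeKeyframe(PHONEME_TO_VISEME.get(ph, 0), start, dur)
        for ph, start, dur in timed_phonemes
    ]

if __name__ == "__main__":
    # Toy timing output for the word "beam" (made-up durations).
    tts_output = [("b", 0, 80), ("iy", 80, 180), ("m", 260, 90), ("sil", 350, 100)]
    for kf in phonemes_to_visemes(tts_output):
        print(kf)
```

A real system would additionally co-articulate neighboring visemes (blending adjacent lip shapes rather than switching abruptly) and drive the MPEG-4 facial animation parameters from these keyframes, but the lookup-plus-timing structure above is the core of the approach.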