Real-time facial character animation

2015 
This demonstration paper presents a real-time facial character animation application in which a person's facial expressions are synthesized on a virtual avatar as they occur. The proposed method requires no training or calibration for the person interacting with the system. An Active Appearance Model (AAM) based technique tracks more than 500 points on the face, and these points drive the animated expressions of the virtual avatar. The sex, age, and ethnicity of the subject in front of the camera can also be analyzed automatically, so the visualization of the avatar can be adapted accordingly. The application requires only a standard webcam, is intended for gaming, entertainment, and video-conferencing purposes, and will be presented in a real-time setup during the demo session.
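To make the described pipeline concrete, the sketch below outlines a capture-track-animate loop under stated assumptions: frame grabbing uses OpenCV's real VideoCapture API, while `AAMTracker` and `Avatar` are hypothetical placeholders standing in for the paper's AAM-based landmark tracker, attribute analysis, and avatar renderer, which are not described in implementation detail in the abstract.

```python
# Minimal sketch of a real-time facial animation loop.
# Assumptions: `AAMTracker` (AAM fitting + attribute estimation) and `Avatar`
# (avatar rig/renderer) are hypothetical modules, not the authors' code.
import cv2

from aam_tracker import AAMTracker   # hypothetical: pre-trained AAM landmark tracker
from avatar_rig import Avatar        # hypothetical: virtual avatar renderer


def run_demo(camera_index: int = 0) -> None:
    capture = cv2.VideoCapture(camera_index)   # standard webcam
    tracker = AAMTracker(num_points=500)       # pre-trained model, no per-user calibration
    avatar = Avatar()

    while True:
        ok, frame = capture.read()
        if not ok:
            break

        # Fit the AAM to the current frame; assume it returns (N, 2) landmark coordinates.
        landmarks = tracker.fit(frame)
        if landmarks is not None:
            # Optional demographic attributes (sex, age, ethnicity) used to adapt the avatar's look.
            attributes = tracker.estimate_attributes(frame, landmarks)
            avatar.update(landmarks, attributes)

        avatar.render()
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    capture.release()


if __name__ == "__main__":
    run_demo()
```

The key design point suggested by the abstract is that the tracker is fully pre-trained, so the loop needs no per-user enrollment step before animation begins.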