Tracking Facial Expressions by Using Stereoscopy Video and Back Propagation Neural Network

2013 
In this paper we propose a method for tracking facial expressions. A system with two cameras captures stereoscopic video sequences. The acquired frames are matched through a correlation method that produces a resulting frame, which is then processed to detect a human face using the Viola and Jones (VJ) method. Within the detected face region, the Nitzberg operator provides the feature points of the eyes, eyebrows, nose and mouth. These points are fed into a backpropagation neural network capable of learning and classifying the facial expressions a person makes when feeling surprised, scared, unhappy, sad, mad or happy. The result of this process is the recognition of facial expressions.
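The following is a minimal sketch of the kind of pipeline the abstract describes, not the authors' implementation: it assumes OpenCV's Haar cascade as the Viola-Jones detector, uses Shi-Tomasi corners (cv2.goodFeaturesToTrack) as a stand-in for the Nitzberg operator, and uses scikit-learn's MLPClassifier as a stand-in backpropagation network. The stereo correlation step is omitted; a single grayscale frame is assumed as input.

```python
# Hypothetical sketch: VJ face detection + corner features + backpropagation classifier.
# Not the paper's code; detectors and network are stand-ins (see lead-in above).
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

EXPRESSIONS = ["surprised", "scared", "unhappy", "sad", "mad", "happy"]

# Viola-Jones detector shipped with OpenCV (Haar cascade for frontal faces).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_features(frame_gray, n_points=20):
    """Detect the largest face and return a fixed-length vector of normalized
    corner coordinates inside the face region (stand-in for Nitzberg points)."""
    faces = face_cascade.detectMultiScale(frame_gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep largest detection
    roi = frame_gray[y:y + h, x:x + w]
    corners = cv2.goodFeaturesToTrack(roi, maxCorners=n_points,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return None
    pts = corners.reshape(-1, 2) / np.array([w, h], dtype=np.float32)  # normalize
    pts = np.vstack([pts, np.zeros((n_points - len(pts), 2))])         # pad to fixed size
    return pts.flatten()

# Backpropagation network: one hidden layer trained on labeled feature vectors.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)

# Training would use feature vectors from labeled frames, e.g. (hypothetical data):
# X_train = np.vstack([extract_features(f) for f in labeled_frames])
# clf.fit(X_train, y_train)   # y_train holds indices into EXPRESSIONS
```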