Continuous Facial Emotion Recognition System Using PCA for Ambient Living

2019 
Facial emotion recognition is a widely used and attractive area of affective computing, particularly in computer vision for healthcare applications. Facial expressions vary over time and across persons, and they play the most important role in automatic emotion recognition by computers while also aiding human–machine interfaces. People distinguish facial expressions easily and quickly, but for computers this remains a challenge. The presented work proposes an eigenface-based technique: Principal Component Analysis (PCA) extracts the relevant information from frames in which human faces are detected, since facial expressions convey the underlying emotions, and at the same time reduces the dimensionality of the computation. The pipeline detects the face, extracts features, reduces dimensionality with PCA, classifies the emotion using a Euclidean distance metric, and then applies temporal dynamics (Patthe and Anil, Temporal dynamics of continuous facial emotion recognition system, 2017) to discard redundant frames carrying the same emotion. Eigenvectors computed from the set of training images define the face space, and PCA compresses the frames across eight orientations and the relevant scales. Part of the database is used for training and the remaining frames for testing; the training frames cover the emotions angry, disgust, happy, neutral, and surprise. Experiments were carried out on the Indian Face Database, with 30 frames used for training and 50 frames for testing, yielding a recognition rate of 91.26%.
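The abstract does not include an implementation; the following is a minimal sketch of the pipeline it describes (PCA/eigenface projection, Euclidean nearest-neighbour emotion classification, and suppression of redundant consecutive labels), written in NumPy. The function names, image size, number of components, and the random stand-in data are illustrative assumptions, not the authors' code.

```python
# Sketch of the described pipeline: PCA (eigenfaces) for dimensionality
# reduction, nearest-neighbour classification with a Euclidean distance
# metric, and a simple temporal filter that drops consecutive redundant
# emotion labels. Shapes, parameters, and data are assumptions.
import numpy as np

EMOTIONS = ["angry", "disgust", "happy", "neutral", "surprise"]

def fit_pca(train_faces, n_components=20):
    """train_faces: (n_samples, n_pixels) matrix of flattened grayscale face crops."""
    mean_face = train_faces.mean(axis=0)
    centered = train_faces - mean_face
    # Principal directions via SVD; rows of vt are the eigenfaces.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = vt[:n_components]            # (n_components, n_pixels)
    train_weights = centered @ eigenfaces.T   # project training set into face space
    return mean_face, eigenfaces, train_weights

def classify(face, mean_face, eigenfaces, train_weights, train_labels):
    """Project a new face into the eigenface space and return the label of the
    nearest training sample under Euclidean distance."""
    weights = (face - mean_face) @ eigenfaces.T
    distances = np.linalg.norm(train_weights - weights, axis=1)
    return train_labels[int(np.argmin(distances))]

def suppress_redundant(labels):
    """Temporal step: keep a frame's label only when it differs from the previous frame."""
    kept = []
    for lab in labels:
        if not kept or kept[-1] != lab:
            kept.append(lab)
    return kept

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in data: 30 training and 50 test "frames" of 64x64 pixels,
    # mirroring the train/test split reported for the Indian Face Database.
    X_train = rng.random((30, 64 * 64))
    y_train = [EMOTIONS[i % len(EMOTIONS)] for i in range(30)]
    X_test = rng.random((50, 64 * 64))

    mean_face, eigenfaces, train_w = fit_pca(X_train, n_components=20)
    per_frame = [classify(f, mean_face, eigenfaces, train_w, y_train) for f in X_test]
    print(suppress_redundant(per_frame))
```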