Comparison of Facial Emotion Recognition Based on Image Visual Features and EEG Features.

2018 
Automatic facial emotion recognition plays an important role in human-computer interaction. Although humans can recognize emotions with little or no effort, reliable emotion recognition by machines remains a challenge. To explore whether machines can discriminate emotions better than humans, we propose two approaches to facial emotion recognition: one based on image visual features, the other based on EEG signals recorded while subjects watched facial emotion pictures. Correspondingly, a Deep Convolutional Neural Network (DCNN) model is adopted to let the machine learn visual features from facial emotion pictures automatically, and a Gated Recurrent Unit (GRU) model is used to extract emotion-specific EEG features from the EEG signals. The two methods were evaluated on the Chinese Facial Affective Picture System (CFAPS) and our Emotion EEG dataset (EMOT), and the recognition performance based on EEG features was found to be significantly better than that based on image visual features.
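A minimal sketch of the two classifiers described above, assuming PyTorch. The layer sizes, input resolution (48x48 grayscale faces), EEG channel count (32), and number of emotion classes (7) are illustrative assumptions; the abstract does not specify the actual architectures.

```python
import torch
import torch.nn as nn

class ImageDCNN(nn.Module):
    """DCNN that classifies facial emotion directly from pictures (hypothetical layer sizes)."""
    def __init__(self, num_classes=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),   # assumes 48x48 input images
            nn.Linear(128, num_classes),
        )

    def forward(self, x):            # x: (batch, 1, 48, 48) grayscale face images
        return self.classifier(self.features(x))

class EEGGRU(nn.Module):
    """GRU that classifies emotion from a multi-channel EEG sequence (hypothetical sizes)."""
    def __init__(self, num_channels=32, hidden_size=64, num_classes=7):
        super().__init__()
        self.gru = nn.GRU(input_size=num_channels, hidden_size=hidden_size,
                          batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):            # x: (batch, time_steps, num_channels)
        _, h = self.gru(x)           # h: (1, batch, hidden_size), final hidden state
        return self.fc(h.squeeze(0))
```

In this sketch the DCNN operates on each stimulus picture, while the GRU consumes the EEG recording as a time series and uses its final hidden state as the emotion-specific feature for classification.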