Affective State Analysis Through Visual and Thermal Image Sequences

2021 
Negative affective states such as stress, anxiety, and depression in students have raised significant concerns. This paper presents a contactless system based on twin channels of thermal and visual image sequences to register the affective states of an individual during Human–Computer Interaction (HCI). The first phase obtains the dominant emotional state from an ensemble of cues in visual and thermal facial images using a newly proposed cascaded Convolutional Neural Network (CNN) model named EmoScale. The second phase classifies a sequence of the obtained emotional states, using a trained Hidden Markov Model (HMM), as one of three affective states: anxiety, depression, or stress. We perform fivefold cross-validation of EmoScale on our self-prepared dataset. The performance of the second phase is compared with the standard Depression Anxiety Stress Scale (DASS), and the results are found to be promising.
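The abstract does not specify how the HMM maps an emotion sequence to an affective state. A common arrangement, sketched below purely as an assumption, is to train one discrete-observation HMM per affective state and select the state whose model assigns the highest likelihood to the observed emotion sequence from the first phase. The emotion label set, the HMM parameters, and the forward-algorithm scoring are illustrative placeholders, not the authors' trained models.

```python
import numpy as np

EMOTIONS = ["neutral", "happy", "sad", "fear", "anger"]  # hypothetical label set from phase one

def forward_likelihood(pi, A, B, obs):
    """P(obs | HMM) via the forward algorithm.
    pi: (S,) initial state probabilities, A: (S, S) transition matrix,
    B: (S, O) emission probabilities over emotion labels, obs: list of label indices."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def make_random_hmm(n_states, n_obs, rng):
    """Placeholder parameters; in practice these would be learned (e.g. by Baum-Welch)."""
    pi = rng.dirichlet(np.ones(n_states))
    A = rng.dirichlet(np.ones(n_states), size=n_states)   # each row sums to 1
    B = rng.dirichlet(np.ones(n_obs), size=n_states)      # emission distribution per state
    return pi, A, B

rng = np.random.default_rng(0)
hmms = {state: make_random_hmm(3, len(EMOTIONS), rng)
        for state in ("anxiety", "depression", "stress")}

# Emotion sequence as produced by the first phase (EmoScale) over an interaction session.
emotion_sequence = [EMOTIONS.index(e) for e in ["sad", "fear", "sad", "neutral", "fear"]]

scores = {state: forward_likelihood(pi, A, B, emotion_sequence)
          for state, (pi, A, B) in hmms.items()}
print("predicted affective state:", max(scores, key=scores.get))
```

Under this reading, "clustering" the emotion sequence amounts to a maximum-likelihood choice among the three per-state models; whether the paper instead uses a single HMM whose hidden states are the affective states is not stated in the abstract.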