Measuring Human Emotion in Short Documents to Improve Social Robot and Agent Interactions

2019 
Social robots and agents can interact with people better if they can infer their affective state (emotions). While they cannot yet recognise affective state from tone and body language, they can use the fragments of speech that they (over)hear. We show that emotions, as conventionally framed, are difficult to detect. We suggest, from empirical results, that this is because emotions are the wrong level of granularity: each emotion contains subemotions that are much more clearly separated from one another, and are therefore easier both to detect and to exploit.
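The following minimal sketch (not the authors' pipeline) illustrates the underlying idea that a coarse emotion class may be better modelled as several tighter sub-emotion clusters. It uses synthetic two-dimensional "document embedding" blobs and scikit-learn's KMeans with silhouette scores purely as an assumed, hypothetical illustration of why finer-grained clusters can be easier to separate.

```python
# Hypothetical illustration only: synthetic data, not the paper's features or method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# Pretend these are embeddings of short documents all labelled with one coarse
# emotion (e.g. "anger"), secretly generated by two sub-emotions
# (e.g. "irritation" vs "rage").
sub_a = rng.normal(loc=(0.0, 0.0), scale=0.3, size=(100, 2))
sub_b = rng.normal(loc=(2.0, 2.0), scale=0.3, size=(100, 2))
docs = np.vstack([sub_a, sub_b])

# Splitting the coarse class into k sub-clusters: the higher silhouette at k=2
# suggests two well-separated sub-emotions inside the single coarse label.
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(docs)
    print(f"k={k}: silhouette = {silhouette_score(docs, labels):.2f}")
```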