A Visual Sensing Platform for Robot Teachers

2019 
This paper describes our ongoing work to develop a visual sensing platform that can inform a robot teacher about the behaviour and affective state of its student audience. We have developed a multi-student behaviour recognition system, which can detect behaviours such as "listening" to the lecturer, "raising hand", or "sleeping". We have also developed a multi-student affect recognition system which, starting from eight basic emotions detected from facial expressions, can infer higher emotional states relevant to a learning context, such as "interested", "distracted" and "confused". Both systems are being tested with the SoftBank robot Pepper, which responds to students' behaviours and emotional states with adapted movements, postures and speech.
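The inference step described above, from basic emotions to learning-relevant states, could be sketched as a simple rule-based mapping. The emotion labels, state labels and weights below are illustrative assumptions, not the paper's actual model:

```python
# Hypothetical sketch: mapping eight basic-emotion probabilities to
# higher-level, learning-relevant states. The emotion set and all
# weights are illustrative assumptions, not the paper's model.

BASIC_EMOTIONS = ["joy", "surprise", "anger", "fear",
                  "sadness", "disgust", "contempt", "neutral"]

# Each higher-level state is scored as a weighted sum of basic-emotion
# probabilities (weights invented for illustration only).
STATE_WEIGHTS = {
    "interested": {"joy": 0.6, "surprise": 0.4},
    "distracted": {"neutral": 0.5, "contempt": 0.3, "sadness": 0.2},
    "confused":   {"surprise": 0.5, "fear": 0.3, "sadness": 0.2},
}

def infer_state(emotion_probs):
    """Return the best-scoring higher-level state for one detected face."""
    scores = {
        state: sum(w * emotion_probs.get(e, 0.0) for e, w in weights.items())
        for state, weights in STATE_WEIGHTS.items()
    }
    return max(scores, key=scores.get)

# A surprise-dominated face with some fear scores highest for "confused".
probs = {"joy": 0.1, "surprise": 0.7, "fear": 0.2}
print(infer_state(probs))  # → confused
```

A real system would replace the hand-set weights with a classifier trained on labelled classroom data, but the interface, per-face emotion probabilities in and one learning-relevant state out, would look much the same.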