ARCS-Assisted Teaching Robots Based on Anticipatory Computing and Emotional Big Data for Improving Sustainable Learning Efficiency and Motivation

2020 
With the vigorous development of anticipatory computing worldwide in recent years, artificial intelligence (AI) has found numerous applications in daily life. Learning analytics of big data can help students, teachers, and school administrators gain new knowledge and estimate learning information; in turn, the enhanced education contributes to the rapid development of science and technology. Education is sustainable lifelong learning, as well as the most important promoter of science and technology worldwide. In recent years, a large number of AI-based anticipatory computing applications have promoted the training of professional AI talent. Accordingly, this study aims to design an interactive robot-assisted teaching system for classroom settings to help students overcome academic difficulties. Teachers, students, and robots in the classroom can interact with each other through the ARCS (Attention, Relevance, Confidence, Satisfaction) motivation model embedded in the programming. The proposed method can help students develop motivation, relevance, and confidence in learning, thus enhancing their learning effectiveness. The robot, like a teaching assistant, can help students solve problems in the classroom by answering questions and evaluating students’ answers through natural, responsive interactions. These natural interactive responses are achieved using a database of emotional big data (the Google facial expression comparison dataset). The robot is loaded with an emotion recognition system that assesses students’ moods through their expressions and voices and then offers corresponding emotional responses. By communicating naturally with students, the robot can attract their attention, trigger their learning motivation, and improve their learning effectiveness.
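The emotion-aware interaction described above can be illustrated with a minimal sketch. The emotion labels, responses, and function names below are illustrative assumptions, not the authors' actual implementation; a real system would obtain the emotion label from a recognition model trained on facial-expression and voice data.

```python
# Hypothetical sketch: mapping a detected student emotion to a robot reply.
# Labels and replies are assumptions for illustration only.

RESPONSES = {
    "happy": "Great work! Let's try a harder problem.",
    "confused": "Let me explain that step again more slowly.",
    "frustrated": "Take a breath. We can break this into smaller parts.",
    "neutral": "Shall we continue with the next exercise?",
}

def respond_to_emotion(emotion: str) -> str:
    """Return an encouraging reply matched to the detected emotion.

    Unrecognized labels fall back to the neutral response so the robot
    always answers naturally.
    """
    return RESPONSES.get(emotion, RESPONSES["neutral"])
```

In a full pipeline, the argument to `respond_to_emotion` would come from the robot's emotion recognition module, and the returned string would be spoken via text-to-speech.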