Cognitive Architecture to Composite Emotions from Autonomic Nervous System for Robotic Head

2013 
In this paper, methods are proposed for generating facial expressions, defined by Ekman's facial expression database, on a robotic head. The paper addresses how to generate two emotions at the same time and how to resolve conflicts between two incompatible emotions. It uses a cognitive architecture together with biological values to generate the probabilities of a hidden Markov model. The cognitive architecture is characterized by three main parts: a perception and decoding module that simulates brain structures such as the thalamus, amygdala, hypothalamus and hippocampus; a learning module that memorizes the data and transmits the behavior of an emotion; and an emotion module that generates the facial expression on the robotic head for a given environment. This paper presents two basic emotions simultaneously, not only one as tested previously. However, according to psychologists, some conflicts exist between emotions. This research develops a table of emotion compatibility based on the work of Robert Plutchik, one of the precursors of emotional intelligence. To use Robert Plutchik's emotion database together with Paul Ekman's emotion database, we decide to work with the six universal basic emotions, anger, disgust, fear, joy, sadness and surprise, which can be combined. The problem of conflicting composite emotions is solved by using the properties of the biological stimuli. In this way, composite emotions become possible.
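The sketch below illustrates, under stated assumptions, how a Plutchik-style compatibility table could gate composite emotions and how a conflict might be resolved by comparing biological stimulus intensities. The compatible pairs, intensity values, and function names are hypothetical illustrations, not the authors' actual table or implementation.

```python
# Minimal sketch (assumptions, not the paper's implementation): a compatibility
# table over Ekman's six universal basic emotions, with conflicts between
# incompatible emotions resolved by the stronger (assumed) stimulus intensity.

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

# Hypothetical pairs assumed blendable on Plutchik's wheel (illustrative only).
COMPATIBLE = {
    frozenset({"joy", "surprise"}),
    frozenset({"fear", "surprise"}),
    frozenset({"anger", "disgust"}),
    frozenset({"sadness", "disgust"}),
}

def composite_emotion(e1: str, e2: str, intensity1: float, intensity2: float):
    """Return the emotion(s) to express on the robotic head.

    Compatible emotions are blended; otherwise the emotion whose (assumed)
    biological stimulus intensity is higher wins the conflict.
    """
    if e1 == e2:
        return [e1]
    if frozenset({e1, e2}) in COMPATIBLE:
        return [e1, e2]  # blend both expressions
    return [e1] if intensity1 >= intensity2 else [e2]  # stronger stimulus wins

if __name__ == "__main__":
    print(composite_emotion("joy", "surprise", 0.6, 0.4))  # ['joy', 'surprise']
    print(composite_emotion("joy", "sadness", 0.3, 0.7))   # ['sadness']
```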