A basic study on dynamic control of facial expressions for Face Robot

1994 
In order to develop an active human interface that realizes "heart-to-heart" virtual communication between an intelligent machine and a human being, we have previously reported the "Face Robot," which has a human-like face and can display facial expressions similar to those of a human being by using a flexible microactuator (FMA). To realize real-time communication between an intelligent machine and a human being, the Face Robot must display its facial expressions at almost the same speed and in the same manner as a human being. However, the FMA was found unable to meet this performance requirement when expressing dynamic facial features. This paper deals with the development of a new mini-actuator, "ACDIS," for real-time display of the Face Robot's facial expressions, together with its control method. The developed double-action piston-type actuator can measure the piston displacement inside ACDIS by means of an LED and a phototransistor mounted within it. The opening time of the electromagnetic valve is regulated by a PD control algorithm to control the displacement of ACDIS. ACDIS is found to have sufficient performance in piston-movement speed; we carry out real-time facial-expression experiments on the Face Robot and confirm that the display of human-like facial expressions is successfully realized.
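The displacement control described above can be sketched as a simple PD law whose output is interpreted as the valve opening time for the current control cycle. This is a minimal illustrative sketch, not the paper's implementation: the gains, control period, and function name are assumptions introduced here.

```python
# Hypothetical sketch of PD control of ACDIS piston displacement.
# All gains and the control period are illustrative assumptions,
# not values from the paper.

KP = 0.8    # proportional gain (assumed)
KD = 0.05   # derivative gain (assumed)
DT = 0.002  # control period in seconds (assumed)

def pd_valve_opening_time(target, measured, prev_error):
    """Return (valve opening time, current error).

    The error between the target displacement and the displacement
    measured by the LED/phototransistor sensor is fed to a PD law;
    the output is taken as the time the electromagnetic valve is
    held open during this control cycle.
    """
    error = target - measured
    derivative = (error - prev_error) / DT
    opening_time = KP * error + KD * derivative
    # The valve cannot be opened for a negative duration; clamp at zero.
    return max(0.0, opening_time), error
```

In a control loop, the returned error would be carried over as `prev_error` for the next cycle, so the derivative term damps rapid changes in displacement error.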