Beat gesture recognition and finger motion control of a piano playing robot for affective interaction of the elderly

2008 
This paper introduces a piano-playing robot, viewed as part of smart-house and assistive-robot technology for attending to the affective states of the elderly. We address the current issues in this research area and propose the piano-playing robot as a solution. For music-based affective interaction, we first present a beat gesture recognition method that synchronizes the robot's playing tempo with the tempo desired by the user. To estimate the period of an unstructured beat gesture expressed with any part of the body or with an object, we apply an optical flow method and use the trajectories of the center of gravity and the normalized central moments of the moving objects in the image sequence. In addition, we apply a motion control method by which the robotic fingers are trained to follow a set of desired trajectories. Since the accuracy of trajectory tracking influences the sound the piano produces, we adopt an iterative learning control method to reduce the tracking error.
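The beat-period estimation step can be illustrated with a minimal sketch: assuming the optical-flow stage has already produced a per-frame center-of-gravity (CoG) trajectory of the moving body part or object, the gesture period can be recovered from the trajectory's autocorrelation. The frame rate, the synthetic trajectory, and the autocorrelation approach below are illustrative assumptions, not the paper's exact pipeline.

```python
# Minimal sketch (assumed pipeline): estimate the period of a repetitive beat
# gesture from the vertical CoG trajectory of a moving object, via autocorrelation.
import numpy as np

def estimate_beat_period(cog_y, fps=30.0, min_period_s=0.2):
    """Return the dominant gesture period in seconds from a 1-D CoG trajectory."""
    x = np.asarray(cog_y, dtype=float)
    x = x - x.mean()                                    # remove the DC offset
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocorrelation, lags >= 0
    min_lag = int(min_period_s * fps)                   # ignore implausibly short periods
    lag = min_lag + int(np.argmax(ac[min_lag:]))        # strongest repetition lag
    return lag / fps

# Example: a 1 Hz up-down beat gesture sampled at 30 fps
t = np.arange(0, 5, 1 / 30.0)
cog_y = np.sin(2 * np.pi * 1.0 * t)
print(estimate_beat_period(cog_y))                      # ~1.0 s, i.e. 60 BPM
```

The recovered period could then be mapped to a playing tempo (e.g. 60 / period beats per minute) for synchronizing the robot with the user.

The finger-trajectory learning step can likewise be sketched with a generic P-type iterative learning control (ILC) update on a toy first-order joint model; the plant parameters, learning gain, and reference trajectory are assumptions for illustration and do not describe the robot in the paper.

```python
# Minimal sketch of P-type ILC on an assumed first-order joint model.
import numpy as np

def simulate_joint(u, a=0.3, b=0.7):
    """Toy discrete joint dynamics: y[t+1] = a*y[t] + b*u[t]."""
    y = np.zeros(len(u) + 1)
    for t in range(len(u)):
        y[t + 1] = a * y[t] + b * u[t]
    return y[1:]

def ilc(reference, iterations=50, gamma=1.0):
    """P-type ILC update: u_{k+1}(t) = u_k(t) + gamma * e_k(t)."""
    u = np.zeros(len(reference))
    for _ in range(iterations):
        y = simulate_joint(u)
        e = reference - y                 # tracking error of this trial
        u = u + gamma * e                 # learn from the previous trial
    return u, np.max(np.abs(reference - simulate_joint(u)))

# Desired key-press trajectory over one trial (illustrative)
t = np.linspace(0, 1, 100)
ref = 0.5 * (1 - np.cos(2 * np.pi * t))
u, final_err = ilc(ref)
print(f"max tracking error after learning: {final_err:.2e}")
```

With these (assumed) parameters the trial-to-trial error contracts monotonically, which mirrors the role ILC plays in the paper: reducing the tracking error over repeated trials so that the keystrokes produce the intended sound.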