Navigation of a humanoid robot via head gestures based on global and local live videos on Google Glass

2017 
The navigation of a mobile robot is a central problem in robotics research. Google Glass, a smart wearable device, offers a new way to achieve human-machine interaction. This paper presents a strategy for navigating a NAO humanoid robot via head gestures, based on global and local live videos displayed on Google Glass. We develop a module that establishes a connection between the Google Glass and the robot and detects head gestures by fusing multi-sensor data through a complementary filter, which eliminates drift in the head-gesture reference. We conduct an obstacle avoidance task to validate the effectiveness of the control system; an operator wearing Google Glass was able to navigate the robot smoothly.
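The abstract does not give the filter equations, but a complementary filter for head-orientation estimation typically blends the gyroscope integral (responsive but drifting) with an accelerometer-derived angle (drift-free but noisy). The sketch below is a minimal illustration of that idea; the function names, the blending coefficient alpha = 0.98, and the 50 Hz update rate are illustrative assumptions, not values taken from the paper.

```python
import math

def accel_to_pitch(ax, ay, az):
    """Gravity-referenced pitch angle (radians) from accelerometer axes (assumed layout)."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One complementary-filter step: blend integrated gyro rate with the accelerometer angle.

    The gyroscope term tracks fast head motion; the accelerometer term
    anchors the estimate to gravity and suppresses long-term drift.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt          # integrate angular rate over the step
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Example update step at an assumed 50 Hz sensor rate
pitch = 0.0
dt = 0.02
pitch = complementary_filter(
    pitch,
    gyro_rate=0.10,                              # rad/s reported by the gyroscope
    accel_pitch=accel_to_pitch(0.1, 0.0, 9.8),   # angle inferred from gravity
    dt=dt,
)
```

The filtered pitch (and an analogous roll estimate) could then be thresholded to map head tilts onto discrete robot motion commands, which is consistent with the head-gesture navigation described in the abstract.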