Development of a wearable HCI controller through sEMG & IMU sensor fusion

2016 
This paper presents a novel wearable human-computer interface that allows a user to interact with computer-based applications through the fusion of sEMG and IMU sensors. The proposed system detects human motion intention, specifically wrist and hand gestures, then translates the gestures and transmits them as command signals to computer-based applications. The novelty of the proposed system is a training-free control scheme that decodes sEMG signals into target motions. Gestures are classified using two sEMG sensors mounted on the forearm, while one IMU sensor computes the real-time arm configuration, which serves as a command signal for the cursor position through the proposed projection method. The method also includes a drift compensation algorithm that makes the system more robust during prolonged operation and more comfortable to use. To evaluate the applicability of the proposed method, we developed a presentation controller that allows the user to control the mouse cursor and issue three distinct commands using wrist and hand gestures. The proposed system is validated in experiments with six subjects.
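The abstract does not give the details of the projection method or the drift compensation algorithm, but the general idea of mapping an IMU-derived arm orientation to a cursor position, with a slow baseline correction to absorb gyroscope drift, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the angular window sizes, the linear mapping, and the exponential baseline tracker are all assumptions introduced here for illustration.

```python
import math


def project_to_cursor(yaw, pitch, screen_w=1920, screen_h=1080,
                      yaw_range=math.radians(60), pitch_range=math.radians(40)):
    """Map forearm yaw/pitch (radians, 0 = pointing at screen centre)
    to pixel coordinates via a simple linear projection (assumed mapping,
    not the paper's method)."""
    # Clamp to the usable angular window, then scale linearly to the screen.
    yaw = max(-yaw_range / 2, min(yaw_range / 2, yaw))
    pitch = max(-pitch_range / 2, min(pitch_range / 2, pitch))
    x = (yaw / yaw_range + 0.5) * (screen_w - 1)
    y = (0.5 - pitch / pitch_range) * (screen_h - 1)  # screen y grows downward
    return round(x), round(y)


class DriftCompensator:
    """Exponential baseline tracker: slowly absorbs a constant angular
    drift so the cursor stays centred during prolonged use (an assumed
    stand-in for the paper's drift compensation algorithm)."""

    def __init__(self, alpha=0.001):
        self.alpha = alpha      # small alpha -> baseline adapts slowly
        self.baseline = 0.0

    def update(self, angle):
        # Move the baseline a small step toward the current angle,
        # then report the drift-corrected angle.
        self.baseline += self.alpha * (angle - self.baseline)
        return angle - self.baseline
```

Pointing straight at the screen centre (`yaw = pitch = 0`) lands the cursor in the middle of the display, while angles at the edge of the window map to the screen border; the compensator gradually re-centres a slowly drifting angle without noticeably affecting fast intentional motion.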