Classification of upper limb movements using convolutional neural network with 3D inception block

2020 
A brain-machine interface (BMI) based on electroencephalography (EEG) can help overcome movement deficits in patients and enable real-world applications for healthy people. Ideally, a BMI system detects user movement intentions and transforms them into control signals for a robotic arm. In this study, we made progress toward user intention decoding and successfully classified six different reaching movements of the right arm during movement execution (ME). Notably, we designed an experimental environment involving robotic arm movement and proposed a convolutional neural network (CNN) architecture with a 3D inception block for robust classification of executed movements of the same limb. As a result, we obtained a classification accuracy of 0.45 over the six directions for the executed session. The results show that the proposed architecture achieves an approximately 6-13% performance increase over conventional classification models. Hence, we demonstrate that the 3D inception CNN architecture can contribute to the continuous decoding of ME.
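To illustrate the kind of building block the abstract refers to, below is a minimal sketch of a 3D inception block in PyTorch. The branch structure, channel counts, and kernel sizes are illustrative assumptions only; the paper's exact architecture, input dimensions, and hyperparameters are not specified here.

```python
# Minimal sketch of a 3D inception block (assumed structure, not the authors' exact model).
import torch
import torch.nn as nn


class Inception3DBlock(nn.Module):
    """Parallel 3D convolution branches whose outputs are concatenated."""

    def __init__(self, in_channels: int, branch_channels: int = 16):
        super().__init__()
        # 1x1x1 branch: channel mixing only.
        self.branch1 = nn.Sequential(
            nn.Conv3d(in_channels, branch_channels, kernel_size=1),
            nn.BatchNorm3d(branch_channels),
            nn.ReLU(inplace=True),
        )
        # 3x3x3 branch: local spatio-temporal features.
        self.branch3 = nn.Sequential(
            nn.Conv3d(in_channels, branch_channels, kernel_size=3, padding=1),
            nn.BatchNorm3d(branch_channels),
            nn.ReLU(inplace=True),
        )
        # 5x5x5 branch: larger receptive field.
        self.branch5 = nn.Sequential(
            nn.Conv3d(in_channels, branch_channels, kernel_size=5, padding=2),
            nn.BatchNorm3d(branch_channels),
            nn.ReLU(inplace=True),
        )
        # Pooling branch followed by a 1x1x1 projection.
        self.branch_pool = nn.Sequential(
            nn.MaxPool3d(kernel_size=3, stride=1, padding=1),
            nn.Conv3d(in_channels, branch_channels, kernel_size=1),
            nn.BatchNorm3d(branch_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate branch outputs along the channel dimension.
        return torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x), self.branch_pool(x)],
            dim=1,
        )


if __name__ == "__main__":
    # Hypothetical input: a batch of EEG trials arranged as a 3D volume
    # (batch, channels, depth, height, width); the mapping of EEG data to this
    # grid is an assumption for the sake of the example.
    x = torch.randn(2, 1, 8, 9, 9)
    block = Inception3DBlock(in_channels=1)
    print(block(x).shape)  # torch.Size([2, 64, 8, 9, 9])
```

The design choice behind an inception block is to apply several kernel sizes in parallel so the network can capture features at multiple spatio-temporal scales without committing to a single receptive field.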