Classification of Hand Movement Stages for Brain–Computer Interface Using Convolutional Neural Network

2019 
In this paper, a deep learning based control algorithm for a fully functional, real-time prosthetic limb is presented to provide motor rehabilitation and restoration. The proposed scheme detects six different events related to hand movement during a task of grasping and lifting an object, using electroencephalography (EEG). These six events correspond to the sequential stages of a grasp-and-lift action, such as the hand beginning to move or beginning to lift the object. The method aims primarily at increasing the classification performance of BCI applications by combining a convolutional neural network (CNN) based learning approach with low-pass filtering of the EEG recordings.
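A minimal sketch of this kind of pipeline is shown below, assuming a low-pass filtering stage followed by a small 1D CNN that scores six grasp-and-lift events per EEG window. The channel count (32), sampling rate (500 Hz), cutoff frequency (30 Hz), and layer sizes are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative sketch (assumed hyperparameters, not the paper's exact network):
# low-pass filter multichannel EEG, then classify a window with a 1D CNN into
# scores for six grasp-and-lift events.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt


def lowpass(eeg, fs=500.0, cutoff=30.0, order=4):
    """Zero-phase Butterworth low-pass filter applied along the time axis.

    eeg: array of shape (channels, samples).
    """
    b, a = butter(order, cutoff, btype="low", fs=fs)
    return filtfilt(b, a, eeg, axis=-1).copy()


class EEGEventCNN(nn.Module):
    """Small 1D CNN: EEG window (channels x time) -> six event scores."""

    def __init__(self, n_channels=32, n_events=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_events)

    def forward(self, x):              # x: (batch, channels, time)
        h = self.features(x).squeeze(-1)
        return torch.sigmoid(self.classifier(h))  # independent per-event scores


if __name__ == "__main__":
    # One synthetic 2-second window of 32-channel EEG sampled at 500 Hz.
    raw = np.random.randn(32, 1000)
    x = torch.tensor(lowpass(raw)[None], dtype=torch.float32)
    probs = EEGEventCNN()(x)
    print(probs.shape)                 # torch.Size([1, 6])
```

A sigmoid output is used here so each of the six stages can be detected independently within a window; a softmax over mutually exclusive stages would be an equally plausible reading of the abstract.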