Selective Cross-Subject Transfer Learning Based on Riemannian Tangent Space for Motor Imagery Brain-Computer Interface

2021 
A motor imagery (MI) brain-computer interface (BCI) plays an important role in neurological rehabilitation training for stroke patients. A functional magnetic resonance imaging (fMRI)-based MI BCI has high spatial resolution, which can be used to draw functional activation maps. An electroencephalogram (EEG)-based MI BCI has high temporal resolution, which is convenient for real-time BCI control. The two modalities can complement each other well. In this paper, we focus on the EEG-based MI BCI with a view to such a combination. Identifying MI EEG signals remains challenging: due to high inter-session and inter-subject variability, each subject must undergo a long and tedious calibration session to collect a large number of labeled samples for a subject-specific model. To tackle this problem, we propose a supervised selective cross-subject transfer learning (sSCSTL) approach which makes use of the labeled samples from the target and source subjects based on the Riemannian tangent space. Since the covariance matrices representing multi-channel EEG signals lie on a smooth Riemannian manifold, we perform Riemannian alignment to bring the covariance matrices from different subjects close to each other. All aligned covariance matrices are then converted into Riemannian tangent space features, which are used to train a classifier in Euclidean space. The sequential forward floating search (SFFS) method is applied for source selection. To investigate the role of unlabeled samples, we further present semi-supervised and unsupervised versions which utilize all samples and only the unlabeled samples from the target subject, respectively. All of our proposed algorithms transfer the labeled samples from the most suitable source subjects into the feature space of the target subject. Experimental results on two MI datasets demonstrated that our algorithms outperformed several state-of-the-art algorithms, especially for well-performing target subjects.
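The alignment and tangent-space steps described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses the Euclidean mean covariance as the alignment reference (a common approximation to the Riemannian mean) and maps the aligned matrices into the tangent space at the identity via the matrix logarithm; the function name and array shapes are our own assumptions.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def align_and_embed(trials, ref=None):
    """Align trial covariance matrices to a reference and map them
    to tangent-space feature vectors.

    trials : array of shape (n_trials, n_channels, n_samples), EEG epochs.
    ref    : reference SPD matrix; defaults to the Euclidean mean
             covariance (an approximation to the Riemannian mean).
    Returns (features, aligned_covs) where features has shape
    (n_trials, n_channels * (n_channels + 1) // 2).
    """
    # Sample covariance of each trial.
    covs = np.array([t @ t.T / t.shape[1] for t in trials])
    if ref is None:
        ref = covs.mean(axis=0)
    inv_sqrt = fractional_matrix_power(ref, -0.5)
    # Whitening-style alignment: R^{-1/2} C R^{-1/2}.
    aligned = np.array([inv_sqrt @ c @ inv_sqrt for c in covs])
    # Tangent space at the identity: vectorise the matrix logarithm,
    # keeping the upper triangle with off-diagonals scaled by sqrt(2)
    # so the Euclidean norm of the vector matches the matrix norm.
    n = aligned.shape[1]
    iu = np.triu_indices(n)
    weights = np.where(iu[0] == iu[1], 1.0, np.sqrt(2.0))
    feats = np.array([weights * np.real(logm(c))[iu] for c in aligned])
    return feats, aligned
```

With this reference choice, the mean of the aligned covariance matrices is exactly the identity, so trials from different subjects are centered at a common point before any Euclidean classifier (and the SFFS source-selection loop) operates on the tangent-space features.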