HAPPILEE: The Harvard Automated Processing Pipeline In Low Electrode Electroencephalography, standardized software for low-density EEG and ERP data

2021 
Low-density electroencephalography (EEG) recordings (e.g., fewer than 32 electrodes) are widely used in research and clinical practice and enable scalable brain function measurement across a variety of settings and populations. Though a number of automated pipelines have recently been proposed to standardize and optimize EEG preprocessing for high-density systems with state-of-the-art methods, few solutions have emerged that are compatible with low-density systems. However, low-density data often include long recording times and/or large sample sizes that would benefit from similar standardization and automation with contemporary methods. To address this need, we propose the HAPPE In Low Electrode Electroencephalography (HAPPILEE) pipeline as a standardized, automated pipeline optimized for EEG recordings with low-density channel layouts of any size. HAPPILEE processes task-free (e.g., resting-state) EEG, task-related EEG, and event-related potential (ERP) data, from raw files through a series of processing steps including filtering, line-noise reduction, bad-channel detection, artifact rejection from continuous data, segmentation, and bad-segment rejection, all of which have been optimized for low-density data. HAPPILEE also generates post-processing reports of data and pipeline quality metrics to facilitate standardized evaluation and reporting of data quality and processing-related changes to the data. We describe multiple approaches using both recorded and simulated EEG data to optimize and validate pipeline performance. The HAPPILEE pipeline is freely available as part of HAPPE 2.0 software under the terms of the GNU General Public License at: https://github.com/PINE-Lab/HAPPE.
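
As a rough illustration of the kind of preprocessing sequence described above (filtering, line-noise reduction, bad-channel detection, segmentation, and bad-segment rejection), the following is a minimal sketch using the open-source MNE-Python library. HAPPILEE itself is distributed as MATLAB software with its own optimized methods; the file name, thresholds, and bad-channel heuristic below are illustrative assumptions, not the pipeline's actual implementation.

```python
# Minimal, illustrative low-density EEG preprocessing sketch (not HAPPILEE itself).
import numpy as np
import mne

# Load a hypothetical raw low-density EEG recording
raw = mne.io.read_raw_edf("subject01_lowdensity.edf", preload=True)

# 1. Filtering and line-noise reduction
raw.filter(l_freq=1.0, h_freq=100.0)   # band-pass filter
raw.notch_filter(freqs=[60.0])         # attenuate 60 Hz line noise

# 2. Simple bad-channel detection: flag flat or abnormally noisy EEG channels
#    (assumed variance-based heuristic for illustration only)
picks = mne.pick_types(raw.info, meg=False, eeg=True)
stds = raw.get_data(picks=picks).std(axis=1)
bad = (stds < 1e-7) | (stds > 10 * np.median(stds))
raw.info["bads"] = [raw.ch_names[p] for p, b in zip(picks, bad) if b]

# 3. Segmentation into fixed-length epochs (task-free / resting-state style)
epochs = mne.make_fixed_length_epochs(raw, duration=2.0, preload=True)

# 4. Bad-segment rejection by peak-to-peak amplitude threshold (illustrative value)
epochs.drop_bad(reject=dict(eeg=150e-6))

print(f"Kept {len(epochs)} segments; flagged bad channels: {raw.info['bads']}")
```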