Multi-head GAGNN: A Multi-head Guided Attention Graph Neural Network for Modeling Spatio-temporal Patterns of Holistic Brain Functional Networks.

2021 
It has been widely demonstrated that complex brain function is mediated by the interaction of multiple concurrent brain functional networks, each of which is spatially distributed across specific brain regions in a temporally dynamic fashion. Therefore, modeling the spatio-temporal patterns of these holistic brain functional networks provides a foundation for understanding the brain. Compared to conventional modeling approaches such as correlation, general linear models, and matrix decomposition methods, recent deep learning methodologies have shown superior performance. However, existing deep learning models either underutilize the spatial and temporal characteristics of fMRI during model training, or focus on modeling only one targeted brain functional network at a time while ignoring the holistic set, leaving a significant gap in our current understanding of how the brain functions. To bridge this gap, we propose a novel Multi-Head Guided Attention Graph Neural Network (Multi-Head GAGNN) to simultaneously model the spatio-temporal patterns of multiple brain functional networks. In Multi-Head GAGNN, the spatial patterns of multiple brain networks are first modeled in a multi-head attention graph U-net, and then adopted as guidance for modeling the corresponding temporal patterns of those networks in a temporal multi-head guided attention network. Results on two task fMRI datasets from the public Human Connectome Project demonstrate the superior ability and generalizability of Multi-Head GAGNN in simultaneously modeling the spatio-temporal patterns of holistic brain functional networks compared to other state-of-the-art models. This study offers a new and powerful tool for understanding complex brain function.
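The abstract describes a two-stage architecture: spatial maps of multiple brain networks are estimated first, and those maps then guide the extraction of the corresponding temporal patterns. The following is a minimal, illustrative PyTorch sketch of that idea only; all module and variable names are hypothetical, and the paper's multi-head attention graph U-net is stood in for by a single simplified graph-convolution-plus-attention layer. It is not the authors' implementation.

```python
# Hypothetical sketch of "spatial maps guide temporal attention", based only on the
# abstract. Module names, dimensions, and the simplified graph layer are assumptions.
import torch
import torch.nn as nn


class MultiHeadSpatialAttention(nn.Module):
    """Produce K soft spatial maps (one per brain network) over N graph nodes."""

    def __init__(self, in_dim: int, hidden_dim: int, num_networks: int):
        super().__init__()
        self.gcn = nn.Linear(in_dim, hidden_dim)          # simplified graph-conv weight
        self.heads = nn.Linear(hidden_dim, num_networks)  # one attention head per network

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features, adj: (N, N) normalized adjacency
        h = torch.relu(adj @ self.gcn(x))                 # neighborhood aggregation
        return torch.softmax(self.heads(h), dim=0)        # (N, K) spatial maps


class GuidedTemporalAttention(nn.Module):
    """Use spatial maps as guidance to extract per-network temporal patterns."""

    def __init__(self, num_timepoints: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(num_timepoints, num_heads, batch_first=True)

    def forward(self, signals: torch.Tensor, spatial_maps: torch.Tensor) -> torch.Tensor:
        # signals: (N, T) node time series; spatial_maps: (N, K)
        guided = (spatial_maps.t() @ signals).unsqueeze(0)  # (1, K, T) spatially pooled queries
        keys = signals.unsqueeze(0)                         # (1, N, T)
        out, _ = self.attn(guided, keys, keys)               # attend over nodes, per network
        return out.squeeze(0)                                # (K, T) temporal patterns


if __name__ == "__main__":
    N, T, K = 360, 176, 10                                   # nodes, timepoints, networks (arbitrary)
    x, adj, signals = torch.randn(N, T), torch.eye(N), torch.randn(N, T)
    spatial = MultiHeadSpatialAttention(T, 64, K)(x, adj)
    temporal = GuidedTemporalAttention(T)(signals, spatial)
    print(spatial.shape, temporal.shape)                     # (360, 10) (10, 176)
```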