Weakly supervised non-rigid MR-TRUS prostate registration using fully convolutional and recurrent neural networks

2020 
In this study, we propose a new deep learning (DL) framework that combines fully convolutional and recurrent neural networks and integrates them with a weakly supervised method for 3D MRI-transrectal ultrasound (TRUS) image registration. MR and TRUS images are often highly anisotropic in their dimensions. For instance, in 3D US images the voxel spacing in depth is often 5 to 10 times larger than the in-plane spacing within each image slice. This high anisotropy makes the common isotropic 3D kernel generalize poorly, resulting in unsatisfactory registration accuracy. The key idea of the paper is to explicitly leverage 3D image anisotropy by exploiting the intra-slice context with a fully convolutional network (FCN) and the inter-slice context with a recurrent neural network (RNN). After the 3D hierarchical features of the MRI and TRUS volumes have been extracted, we generate the dense deformation field by aligning the corresponding prostate labels for individual image pairs. Experimental results show that our proposed FCN-RNN network produces a mean target registration error (TRE) of 2.77±1.40 mm and a mean Dice similarity coefficient (DSC) of 0.9.
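The abstract describes the core architectural idea: intra-slice features from a 2D FCN, fused with inter-slice context carried by a recurrent unit along the anisotropic depth axis. The following is a minimal PyTorch sketch of that idea, not the authors' released code; the convolutional GRU cell, the layer widths, and the class names are all illustrative assumptions, since the paper only names the FCN and RNN components generically.

```python
# Hedged sketch: a 2D FCN extracts intra-slice features, and an assumed
# convolutional GRU propagates inter-slice context along the depth axis.
import torch
import torch.nn as nn


class SliceFCN(nn.Module):
    """2D fully convolutional feature extractor applied to each slice."""
    def __init__(self, in_ch=1, feat_ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):                        # x: (B, C, H, W)
        return self.net(x)


class ConvGRUCell(nn.Module):
    """Minimal convolutional GRU cell for the inter-slice recurrence."""
    def __init__(self, ch):
        super().__init__()
        self.gates = nn.Conv2d(2 * ch, 2 * ch, 3, padding=1)  # update + reset
        self.cand = nn.Conv2d(2 * ch, ch, 3, padding=1)       # candidate state

    def forward(self, x, h):
        z, r = torch.sigmoid(self.gates(torch.cat([x, h], 1))).chunk(2, 1)
        h_tilde = torch.tanh(self.cand(torch.cat([x, r * h], 1)))
        return (1 - z) * h + z * h_tilde


class FCNRNNEncoder(nn.Module):
    """Intra-slice FCN features fused with inter-slice recurrent context."""
    def __init__(self, in_ch=1, feat_ch=32):
        super().__init__()
        self.feat_ch = feat_ch
        self.fcn = SliceFCN(in_ch, feat_ch)
        self.rnn = ConvGRUCell(feat_ch)

    def forward(self, vol):                      # vol: (B, C, D, H, W)
        b, _, d, h, w = vol.shape
        state = vol.new_zeros(b, self.feat_ch, h, w)
        feats = []
        for k in range(d):                       # sweep along the slice axis
            f = self.fcn(vol[:, :, k])           # intra-slice (2D) features
            state = self.rnn(f, state)           # accumulate inter-slice context
            feats.append(state)
        return torch.stack(feats, dim=2)         # (B, F, D, H, W)


# Usage: a volume with few thick slices but a fine in-plane grid,
# mimicking the anisotropy the abstract describes.
mri = torch.randn(1, 1, 8, 64, 64)
feats = FCNRNNEncoder()(mri)
print(feats.shape)                               # torch.Size([1, 32, 8, 64, 64])
```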
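The weak supervision the abstract mentions comes from aligning prostate labels rather than matching image intensities: the predicted dense deformation warps the MR label, and the overlap with the TRUS label drives training. Below is a hedged sketch of that signal, assuming a trilinear `grid_sample` warp and a standard soft Dice loss; the exact loss and resampling scheme used in the paper are not specified in the abstract.

```python
# Hedged sketch of the weakly supervised training signal: warp the moving
# (MR) prostate label with the predicted dense displacement field and score
# its overlap with the fixed (TRUS) label via a soft Dice loss.
import torch
import torch.nn.functional as F


def warp(label, ddf):
    """Warp a (B,1,D,H,W) label with a displacement field (B,3,D,H,W),
    expressed in the normalized [-1, 1] units that grid_sample expects."""
    b = label.shape[0]
    identity = torch.eye(3, 4).unsqueeze(0).expand(b, -1, -1)
    base = F.affine_grid(identity, size=label.shape,
                         align_corners=False)    # (B, D, H, W, 3) sampling grid
    grid = base + ddf.permute(0, 2, 3, 4, 1)     # add predicted offsets
    return F.grid_sample(label, grid, mode='bilinear', align_corners=False)


def soft_dice_loss(warped, fixed, eps=1e-6):
    """1 - soft Dice overlap between warped moving label and fixed label."""
    inter = (warped * fixed).sum()
    return 1 - (2 * inter + eps) / (warped.sum() + fixed.sum() + eps)


# Usage: with an identity (zero) displacement and identical labels,
# the loss is near zero, i.e. perfect overlap.
ddf = torch.zeros(1, 3, 8, 64, 64)
mr_label = torch.rand(1, 1, 8, 64, 64).round()
trus_label = mr_label.clone()
print(soft_dice_loss(warp(mr_label, ddf), trus_label))
```

At inference no labels are needed: the network maps an MR-TRUS image pair directly to a deformation field, which is how the reported TRE of 2.77±1.40 mm and DSC of 0.9 would be evaluated on held-out cases.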