ToStaGAN: An end-to-end two-stage generative adversarial network for brain tumor segmentation

2021 
Abstract Brain tumor segmentation using MRI data remains challenging for several reasons, and accurately segmenting brain tumors is therefore still a significant topic in medical image segmentation. Following the "coarse-to-fine" idea, this paper proposes an end-to-end two-stage generative adversarial network (ToStaGAN) that improves brain tumor segmentation performance by making full use of high-level semantic information. In the proposed ToStaGAN, a U-Net is adopted as the "coarse" generation network in the first stage, while a U-shaped contextual encoder-decoder (ConEnDer), which consists of the proposed fine-grained extraction module (FEM) and dense skip connections, serves as the "fine" generation network in the second stage. More specifically, the FEM is mainly used to effectively extract fine-grained features, which play a crucial role in semantic segmentation, while adding only a few parameters; it is also able to obtain more diverse and abstract features. By further exploiting the features extracted by the FEM together with the coarse prediction map from the first stage, ConEnDer has great potential to generate refined segmentation results. We evaluate the performance of ToStaGAN on BraTS 2015. The findings suggest that the proposed two-stage generation network outperforms the one-stage network, which also implies the effectiveness of the proposed ConEnDer for "fine" segmentation. A comparative summary further shows that ToStaGAN achieves better segmentation performance than other competing approaches.
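The abstract's "coarse-to-fine" pipeline can be sketched as a simple two-stage forward pass: a first generator produces a coarse segmentation map, and a second generator refines it using both the original MRI input and that coarse map. The sketch below is a minimal illustration of this composition only; the `coarse_generator` and `fine_generator` stand-ins are hypothetical placeholders, not the paper's actual U-Net or ConEnDer architectures.

```python
import numpy as np

def coarse_generator(mri):
    # Stand-in for the stage-1 "coarse" U-Net generator (illustrative
    # assumption): a fixed threshold replaces the learned segmentation.
    return (mri > mri.mean()).astype(np.float32)

def fine_generator(mri, coarse_map):
    # Stand-in for the stage-2 "fine" ConEnDer generator: it receives both
    # the raw MRI slice and the coarse prediction map (channel-wise
    # concatenation), and emits a refined probability map.
    stacked = np.stack([mri, coarse_map], axis=0)
    return stacked.mean(axis=0)  # placeholder refinement step

# End-to-end two-stage forward pass on a dummy 4x4 "slice"
rng = np.random.default_rng(0)
mri = rng.random((4, 4)).astype(np.float32)
coarse = coarse_generator(mri)      # stage 1: coarse prediction map
refined = fine_generator(mri, coarse)  # stage 2: refined prediction map
print(coarse.shape, refined.shape)
```

The key structural point is that stage 2 conditions on stage 1's output rather than on the image alone, which is what makes the network "two-stage" yet still trainable end to end.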