TransU²-Net: An Effective Medical Image Segmentation Framework Based on Transformer and U²-Net.

Abstract

In the past few years, U-Net-based U-shaped architectures with skip connections have made remarkable progress in medical image segmentation. U²-Net achieves good performance in general computer vision; however, in medical image segmentation its deep nesting makes it prone to overfitting. We propose TransU²-Net, a 2D network combining a transformer with a lighter-weight U²-Net, for automatic segmentation of brain tumors in magnetic resonance images (MRI). The lightweight U²-Net architecture captures multi-scale information while reducing redundant feature extraction. Meanwhile, a transformer block embedded in the stacked convolutional layers captures more global information, and transformers in the skip connections enhance the representation of spatial-domain information. We also propose a new multi-scale feature-map fusion strategy as a post-processing step to better fuse high- and low-dimensional spatial information. TransU²-Net achieves strong segmentation results: on the BraTS2021 dataset it reaches an average Dice coefficient of 88.17%, and on the publicly available MSD dataset it achieves a Dice coefficient of 74.69% for tumor segmentation. We also compare TransU²-Net with previously proposed 2D segmentation methods; the experimental results show that our method outperforms them. The proposed automatic segmentation method combining transformers and U²-Net performs well and is of clinical importance.

Clinical Translation Statement: We use the BraTS2021 and MSD datasets, which are publicly available. All experiments in this paper are in accordance with medical ethics. © 2023 The Authors.
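The abstract does not specify how the multi-scale feature-map fusion is implemented, so the following is only a minimal sketch of the general idea: decoder feature maps at different resolutions are brought to a common (finest) resolution and combined along the channel axis. The `upsample_nearest` and `fuse_multiscale` helpers are hypothetical names, and nearest-neighbour upsampling plus channel concatenation are assumptions, not the paper's method.

```python
import numpy as np

def upsample_nearest(feat, factor):
    # feat: (C, H, W) array; repeat rows and columns to scale spatially
    return feat.repeat(factor, axis=1).repeat(factor, axis=2)

def fuse_multiscale(features):
    # features: list of (C_i, H_i, W_i) maps from coarse to fine resolution.
    # Upsample every map to the finest resolution, then concatenate channels.
    target_h = max(f.shape[1] for f in features)
    upsampled = []
    for f in features:
        factor = target_h // f.shape[1]
        upsampled.append(upsample_nearest(f, factor))
    return np.concatenate(upsampled, axis=0)

# Toy example: three decoder stages at 8x8, 16x16, and 32x32 resolution
stages = [
    np.random.rand(4, 8, 8),
    np.random.rand(4, 16, 16),
    np.random.rand(4, 32, 32),
]
fused = fuse_multiscale(stages)
print(fused.shape)  # (12, 32, 32)
```

In a real network, the fusion would typically be followed by a convolution that reduces the concatenated channels back to the number of segmentation classes.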
