A devised thyroid segmentation method with multi-stage modification based on Super-pixel U-Net under insufficient data


Abstract

The application of deep learning to medical image segmentation has received considerable attention. Nevertheless, when segmenting thyroid ultrasound images, it is difficult to achieve good results with deep learning methods because of the large number of non-thyroid regions and the insufficient training data.

In this study, a Super-pixel U-Net, built by adding a supplementary path to U-Net, was devised to improve thyroid segmentation. The supplementary path introduces additional information into the network, boosting the auxiliary segmentation results. The method applies a multi-stage modification consisting of boundary segmentation, boundary repair, and auxiliary segmentation. To reduce the negative effect of non-thyroid regions, a U-Net was first used to obtain rough boundary outputs. A second U-Net was then trained to repair and improve the coverage of those boundary outputs. In the third stage, the Super-pixel U-Net was applied to segment the thyroid more precisely (a sketch of this pipeline appears after the abstract).

Finally, multidimensional indicators were used to compare the segmentation results of the proposed method with those of the comparison experiments (the standard overlap metrics are sketched after the abstract). The proposed method achieved an F1 score of 0.9161 and an IoU of 0.9279. It also exhibited better shape similarity, with an average convexity of 0.9395, an average ratio of 0.9109, an average compactness of 0.8976, an average eccentricity of 0.9448, and an average rectangularity of 0.9289. The average area estimation indicator was 0.8857.

The proposed method exhibited superior performance, demonstrating the benefit of the multi-stage modification and the Super-pixel U-Net.
