
CT2US: Cross-modal transfer learning for kidney segmentation in ultrasound images with synthesized data.


Abstract

Accurate segmentation of the kidney in ultrasound images is a vital procedure in clinical diagnosis and interventional operations. In recent years, deep learning has demonstrated promising prospects in medical image analysis. However, due to the inherent problems of ultrasound imaging, annotated data are scarce and arduous to acquire, hampering the application of data-hungry deep learning methods. In this paper, we propose cross-modal transfer learning from computed tomography (CT) to ultrasound (US) that leverages annotated data in the CT modality. In particular, we adopt a cycle-consistent generative adversarial network (CycleGAN) to synthesize US images from CT data and construct a transition dataset that mitigates the immense domain discrepancy between US and CT. Mainstream convolutional neural networks (CNNs) such as U-Net, U-Res, PSPNet, and DeepLab v3+ are pretrained on the transition dataset and then transferred to real US images. We first trained the CNN models on a training set of 50 ultrasound images and validated them on a validation set of 30 ultrasound images. In addition, we selected 82 ultrasound images from another hospital to construct a cross-site test set for verifying the generalization performance of the models. The experimental results show that with the proposed transfer learning strategy, segmentation accuracy in Dice similarity coefficient (DSC) reaches 0.853 for U-Net, 0.850 for U-Res, 0.826 for PSPNet, and 0.827 for DeepLab v3+ on the cross-site test set. Compared with training from scratch, the accuracy improvements are 0.127, 0.097, 0.105, and 0.036, respectively. Our transfer learning strategy effectively improves the accuracy and generalization ability of ultrasound image segmentation models trained with limited data.

Copyright © 2022 Elsevier B.V. All rights reserved.
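To make the two-stage strategy concrete, here is a minimal PyTorch sketch of the pretrain-then-fine-tune pipeline the abstract describes: stage one trains a segmentation network on CycleGAN-synthesized US images whose masks are inherited from the source CT annotations, and stage two fine-tunes on the small real-US training set. Everything in this sketch is an illustrative assumption rather than the authors' released code: the segmentation_models_pytorch U-Net backbone, the Dice-based loss, the toy in-memory dataset, and all hyperparameters.

```python
# Illustrative sketch of the CT2US two-stage transfer strategy (not the
# authors' code). Assumes the segmentation_models_pytorch library for the
# U-Net backbone; datasets and hyperparameters are stand-ins.
import torch
from torch.utils.data import DataLoader, Dataset
import segmentation_models_pytorch as smp


class PairDataset(Dataset):
    """Hypothetical dataset of (grayscale image, binary mask) tensor pairs."""

    def __init__(self, pairs):
        self.pairs = pairs

    def __len__(self):
        return len(self.pairs)

    def __getitem__(self, i):
        return self.pairs[i]


def toy_pairs(n, size=128):
    # Random stand-in data so the sketch runs end to end; replace with real
    # loaders for the synthesized and the real US images.
    return [(torch.rand(1, size, size), (torch.rand(1, size, size) > 0.5).float())
            for _ in range(n)]


def dice_loss(logits, target, eps=1e-6):
    """Soft Dice loss (1 - DSC) on sigmoid probabilities."""
    prob = torch.sigmoid(logits)
    inter = (prob * target).sum(dim=(2, 3))
    union = prob.sum(dim=(2, 3)) + target.sum(dim=(2, 3))
    return 1.0 - ((2 * inter + eps) / (union + eps)).mean()


def train(model, loader, epochs, lr, device):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for img, mask in loader:
            img, mask = img.to(device), mask.to(device)
            opt.zero_grad()
            dice_loss(model(img), mask).backward()
            opt.step()


device = "cuda" if torch.cuda.is_available() else "cpu"
model = smp.Unet(encoder_weights=None, in_channels=1, classes=1).to(device)

# Stage 1: pretrain on the transition dataset of CycleGAN-synthesized US
# images, whose masks come from the paired CT annotations.
synth = DataLoader(PairDataset(toy_pairs(200)), batch_size=8, shuffle=True)
train(model, synth, epochs=5, lr=1e-3, device=device)

# Stage 2: fine-tune on the small real-US training set (50 images in the
# paper), then evaluate on the held-out and cross-site sets.
real = DataLoader(PairDataset(toy_pairs(50)), batch_size=8, shuffle=True)
train(model, real, epochs=5, lr=1e-4, device=device)
```

Fine-tuning the whole network with a lower learning rate is one common choice for this kind of transfer; the paper's reported gains are measured against the alternative of training each architecture from scratch on the real US data alone.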
