
A coarse-to-fine cascade deep learning neural network for segmenting cerebral aneurysms in time-of-flight magnetic resonance angiography.

Abstract

Accurate segmentation of unruptured cerebral aneurysms (UCAs) is essential for treatment planning and rupture risk assessment. Three-dimensional time-of-flight magnetic resonance angiography (3D TOF-MRA) is currently the most commonly used method for screening aneurysms because it is noninvasive. Deep learning methods can assist radiologists in achieving accurate and reliable analysis of aneurysm size and shape, which may be helpful for rupture risk prediction models. However, existing methods have not achieved accurate segmentation of cerebral aneurysms in 3D TOF-MRA.

This paper proposes CCDU-Net for segmenting UCAs in 3D TOF-MRA images. CCDU-Net is a cascade of a convolutional neural network for coarse segmentation and the proposed DU-Net for fine segmentation. In particular, the dual-channel input of the DU-Net is composed of the vessel image and its contour image, which augments vascular morphological information. Furthermore, a newly designed weighted loss function is used in training the DU-Net to improve segmentation performance.

A total of 270 patients with UCAs were enrolled in this study. The images were divided into training (N = 174), validation (N = 43), and testing (N = 53) cohorts. CCDU-Net achieved a Dice similarity coefficient (DSC) of 0.616 ± 0.167, a Hausdorff distance (HD) of 5.686 ± 7.020 mm, and a volumetric similarity (VS) of 0.752 ± 0.226 in the testing cohort. Compared with the best existing method, the DSC and VS increased by 18% and 5%, respectively, while the HD decreased by one-tenth.

We propose CCDU-Net for segmenting UCAs in 3D TOF-MRA, and the results show that the proposed method outperforms other existing methods.

© 2022. The Author(s).
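
As a rough illustration of the dual-channel idea, the sketch below stacks a vessel image with a contour image derived from it. The abstract does not specify how the contour image is computed, so the morphological-gradient approximation, the function name, and the channel layout here are assumptions for illustration only, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage

def make_dual_channel_input(vessel_volume: np.ndarray) -> np.ndarray:
    """Stack a vessel image with its contour image along the channel axis.

    `vessel_volume` is assumed to be a 3D vessel image from the coarse stage.
    The contour is approximated here by a morphological gradient on a
    binarized copy (dilation XOR erosion), one plausible way to expose
    vascular boundary information; the paper may use a different scheme.
    """
    binary = vessel_volume > 0.5
    dilated = ndimage.binary_dilation(binary)
    eroded = ndimage.binary_erosion(binary)
    contour = dilated ^ eroded  # boundary voxels of the vessel region
    # Channel-first layout assumed: (2, D, H, W) -> vessel image, contour image
    return np.stack([vessel_volume.astype(np.float32),
                     contour.astype(np.float32)], axis=0)

# Toy usage: a small volume containing a single cuboid "vessel"
if __name__ == "__main__":
    toy = np.zeros((16, 16, 16), dtype=np.float32)
    toy[6:10, 6:10, 6:10] = 1.0
    dual = make_dual_channel_input(toy)
    print(dual.shape)  # (2, 16, 16, 16)
```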
