3D Vessel Segmentation With Limited Guidance of 2D Structure-Agnostic Vessel Annotations.

Abstract

Delineating 3D blood vessels across anatomical structures is essential for clinical diagnosis and treatment, but it is challenging due to complex structural variations and varied imaging conditions. Although recent supervised deep learning models have demonstrated superior capacity in automatic 3D vessel segmentation, their reliance on expensive 3D manual annotations and the limited reuse of annotations across different vascular structures hinder clinical application. To avoid a repetitive and costly annotation process for each vascular structure and to make full use of existing annotations, this paper proposes a novel 3D shape-guided local discrimination (3D-SLD) model for 3D vascular segmentation under limited guidance from public 2D vessel annotations. The primary hypothesis is that 3D vessels are composed of semantically similar voxels and often exhibit tree-shaped morphology. Accordingly, a 3D region discrimination loss is first proposed to learn a discriminative representation that measures voxel-wise similarity and clusters semantically consistent voxels into candidate 3D vascular segmentations in unlabeled images. Second, the shape distribution of existing 2D structure-agnostic vessel annotations is introduced, via an adversarial shape constraint loss, to guide the candidate 3D vessels toward tree-shaped morphology. Third, to enhance training stability and prediction credibility, a highlighting-reviewing-summarizing (HRS) mechanism is proposed, which summarizes historical models to maintain temporal consistency and identifies credible pseudo labels as reliable supervision signals. Guided only by public 2D coronary artery annotations, the method achieves results comparable to state-of-the-art barely-supervised methods in 3D cerebrovascular segmentation and the best Dice similarity coefficient (DSC) in 3D hepatic vessel segmentation, demonstrating its effectiveness.
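The clustering idea behind the region discrimination step can be illustrated with a minimal sketch: voxels whose learned embeddings are sufficiently similar to known vessel voxels are grouped into a candidate segmentation. The function name, the cosine-similarity measure, and the threshold `tau` below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def region_discrimination_pseudolabel(embeddings, seeds, tau=0.8):
    """Cluster voxels whose embeddings are similar to vessel seed voxels.

    A simplified stand-in for grouping semantically consistent voxels
    into a candidate 3D vessel mask (cosine similarity and the threshold
    tau are assumptions for illustration).

    embeddings: (N, D) per-voxel feature vectors
    seeds:      (K, D) embeddings of voxels believed to be vessel
    returns:    (N,) boolean mask of candidate vessel voxels
    """
    # L2-normalize so the dot product equals cosine similarity
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sds = seeds / np.linalg.norm(seeds, axis=1, keepdims=True)
    sim = emb @ sds.T  # (N, K) cosine similarity to each seed
    # A voxel joins the candidate vessel if it is close to any seed
    return sim.max(axis=1) >= tau

# Toy usage: two voxels aligned with the seed direction, one orthogonal
emb = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]])
seeds = np.array([[1.0, 0.0]])
mask = region_discrimination_pseudolabel(emb, seeds, tau=0.8)
print(mask.tolist())  # [True, False, True]
```

In the full model this pseudo-label would then be refined by the adversarial shape constraint (pushing clusters toward tree-shaped morphology) and filtered by the HRS mechanism before serving as supervision.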
