Self-derived organ attention for unpaired CT-MRI deep domain adaptation based MRI segmentation.

Abstract

To develop and evaluate a deep learning method for segmenting parotid glands from MRI using unannotated MRIs and unpaired expert-segmented CT datasets. We introduced a new self-derived organ attention deep learning network that combines CT-to-MRI image-to-image translation (I2I) and MRI segmentation, trained as a single end-to-end network. The expert segmentations available on the CT scans were combined with the I2I-translated pseudo-MR images to train the MRI segmentation network; once trained, only the MRI segmentation network is required at inference. We introduced an organ attention discriminator that constrains the CT-to-MR generator to synthesize pseudo-MR images that preserve organ geometry and match the appearance statistics of real MRI. The I2I translation network was regularized during training by the organ attention discriminator, a global image-matching discriminator, and cycle consistency losses, while MRI segmentation training was regularized with a cross-entropy loss. Segmentation performance was compared against multiple domain adaptation-based segmentation methods using the Dice similarity coefficient (DSC) and the 95th-percentile Hausdorff distance (HD95). All networks were trained using 85 unlabeled T2-weighted fat-suppressed (T2wFS) MRIs and 96 expert-segmented CT scans. The performance upper limit was established by fully supervised MRI training using the same 85 T2wFS MRIs with expert segmentations. Independent evaluation was performed on 77 MRIs never used in training. The proposed approach achieved the highest accuracy of the compared methods (left parotid: DSC 0.82 ± 0.03, HD95 2.98 ± 1.01 mm; right parotid: DSC 0.81 ± 0.05, HD95 3.14 ± 1.17 mm). This accuracy was close to the fully supervised MRI segmentation reference (left parotid: DSC 0.84 ± 0.04, HD95 2.24 ± 0.77 mm; right parotid: DSC 0.84 ± 0.06, HD95 2.32 ± 1.37 mm).
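The abstract names four training signals: a global adversarial loss, an organ-attention adversarial loss, a cycle-consistency loss, and a cross-entropy segmentation loss computed on pseudo-MR images with the CT expert labels. The PyTorch-style sketch below shows one plausible way these terms could be combined into a single generator/segmenter update; every module name, the organ-mask construction, and the loss weights (`w_adv`, `w_cyc`, `w_seg`) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

adv_loss = nn.BCEWithLogitsLoss()   # adversarial terms (global and organ attention)
cyc_loss = nn.L1Loss()              # cycle-consistency term
seg_loss = nn.CrossEntropyLoss()    # segmentation term on pseudo MR with CT labels


def generator_step(ct, ct_labels, nets, w_adv=1.0, w_cyc=10.0, w_seg=1.0):
    """Hypothetical single generator/segmenter update combining the losses
    named in the abstract. `nets` bundles the CT->MR generator, MR->CT
    generator, global discriminator, organ-attention discriminator, and
    the MRI segmentation network (all assumed to be nn.Module instances)."""
    g_ct2mr, g_mr2ct, d_global, d_organ, segmenter = nets

    pseudo_mr = g_ct2mr(ct)              # translate CT to pseudo MR
    recon_ct = g_mr2ct(pseudo_mr)        # cycle back to the CT domain
    seg_logits = segmenter(pseudo_mr)    # segment the pseudo MR

    # Global adversarial term: the pseudo MR should look like real MRI overall.
    global_pred = d_global(pseudo_mr)
    l_adv_global = adv_loss(global_pred, torch.ones_like(global_pred))

    # Organ-attention adversarial term: the discriminator focuses on the organ
    # region derived from the segmenter's own prediction (self-derived attention),
    # constraining organ geometry and appearance in the pseudo MR.
    organ_mask = seg_logits.softmax(dim=1)[:, 1:].sum(dim=1, keepdim=True)
    organ_pred = d_organ(pseudo_mr * organ_mask)
    l_adv_organ = adv_loss(organ_pred, torch.ones_like(organ_pred))

    # Cycle consistency (one direction shown) and segmentation supervision
    # from the expert CT labels transferred to the pseudo MR.
    l_cyc = cyc_loss(recon_ct, ct)
    l_seg = seg_loss(seg_logits, ct_labels)

    return w_adv * (l_adv_global + l_adv_organ) + w_cyc * l_cyc + w_seg * l_seg
```

In a full CycleGAN-style setup, a symmetric MR-to-CT cycle and separate discriminator updates would also be needed; they are omitted from this sketch for brevity.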
