
Multimodality Image Registration in the Head-and-Neck Using a Deep Learning-Derived Synthetic CT as a Bridge.


Abstract

To develop and demonstrate the efficacy of a novel head-and-neck multimodality image registration technique using deep learning-based cross-modality synthesis.
Twenty-five head-and-neck patients received MR and CT (CTaligned) scans on the same day with the same immobilization. Five-fold cross-validation over all of the MR-CT pairs was used to train a neural network to generate synthetic CTs (CTsynth) from MR images. Twenty-four of the 25 patients also had a separate CT acquired without immobilization (CTnon-aligned), which was used for testing. Each CTnon-aligned was deformably registered to the synthetic CT, and the result was compared with the same CTnon-aligned registered directly to MR; the corresponding registrations were also performed in the opposite direction, from MR to CTnon-aligned and from synthetic CT to CTnon-aligned. All registrations used B-splines to model the deformation and mutual information as the similarity objective, as sketched below. Results were evaluated using the 95% Hausdorff distance between spinal cord contours, landmark error, inverse consistency, and the Jacobian determinant of the estimated deformation fields.
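The abstract does not specify the authors' registration toolkit, so the following is only a minimal sketch of the bridged step (CTnon-aligned deformed to CTsynth) using SimpleITK: a B-spline transform optimized against Mattes mutual information. The file names, control-point grid spacing, optimizer settings, and pyramid schedule are all illustrative assumptions, not the paper's parameters.

import SimpleITK as sitk

fixed = sitk.ReadImage("ct_synth.nii.gz", sitk.sitkFloat32)        # synthetic CT bridge (hypothetical path)
moving = sitk.ReadImage("ct_nonaligned.nii.gz", sitk.sitkFloat32)  # CT acquired without immobilization

# B-spline control-point grid spanning the fixed-image domain (spacing is an assumption).
grid_spacing_mm = [50.0, 50.0, 50.0]
physical_size = [sz * sp for sz, sp in zip(fixed.GetSize(), fixed.GetSpacing())]
mesh_size = [max(1, int(round(ps / gs))) for ps, gs in zip(physical_size, grid_spacing_mm)]
tx = sitk.BSplineTransformInitializer(fixed, mesh_size)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)   # MI objective, as in the abstract
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5, numberOfIterations=100)
reg.SetInitialTransform(tx, inPlace=True)
reg.SetShrinkFactorsPerLevel([4, 2, 1])      # coarse-to-fine pyramid
reg.SetSmoothingSigmasPerLevel([2, 1, 0])

final_tx = reg.Execute(fixed, moving)
warped = sitk.Resample(moving, fixed, final_tx, sitk.sitkLinear, -1000.0)  # air HU as default value

Note that once the synthetic CT stands in for the MR, the problem is effectively monomodal CT-to-CT, which is what makes the bridge attractive; the abstract nonetheless reports mutual information for all registrations, so it is used here as well.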
When large initial rigid misalignment is present, registering CT to the MR-derived synthetic CT aligns the spinal cord better than direct CT-to-MR registration. The average landmark error decreased from 9.8±3.1 mm for MR→CTnon-aligned to 6.0±2.1 mm for CTsynth→CTnon-aligned deformable registrations. In the CT-to-MR direction, the landmark error decreased from 10.0±4.3 mm for CTnon-aligned→MR to 6.6±2.0 mm for CTnon-aligned→CTsynth deformable registrations. The Jacobian determinant of the estimated deformation fields had an average value of 0.98, indicating nearly volume-preserving deformations. The proposed method also demonstrated improved inverse consistency over direct registration.
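For reference, the reported metrics can be computed along the following lines with SimpleITK; this is an illustrative sketch, not the authors' evaluation code. It reuses `fixed` and `final_tx` from the previous sketch, and `backward_tx`, `cord_mask_fixed`, and `cord_mask_warped` are hypothetical placeholders for the reverse-direction transform and binary spinal cord masks.

import SimpleITK as sitk

# Jacobian determinant of the estimated deformation: values near 1 indicate
# locally volume-preserving, physically plausible deformations.
disp = sitk.TransformToDisplacementField(
    final_tx, sitk.sitkVectorFloat64,
    fixed.GetSize(), fixed.GetOrigin(), fixed.GetSpacing(), fixed.GetDirection())
jac = sitk.DisplacementFieldJacobianDeterminant(disp)
stats = sitk.StatisticsImageFilter()
stats.Execute(jac)
print("mean Jacobian determinant:", stats.GetMean())

# Inverse consistency: compose the forward and backward transforms and measure
# the residual displacement magnitude (0 mm for a perfectly consistent pair).
# CompositeTransform applies the last-added transform first, so this maps each
# point forward through final_tx and back through backward_tx (hypothetical).
composite = sitk.CompositeTransform([backward_tx, final_tx])
residual = sitk.TransformToDisplacementField(
    composite, sitk.sitkVectorFloat64,
    fixed.GetSize(), fixed.GetOrigin(), fixed.GetSpacing(), fixed.GetDirection())
stats.Execute(sitk.VectorMagnitude(residual))
print("mean inverse-consistency error (mm):", stats.GetMean())

# SimpleITK's built-in filter gives the full (100%) Hausdorff distance between
# binary masks; the 95th-percentile variant reported in the abstract requires
# percentile filtering of the surface-to-surface distances instead.
hd = sitk.HausdorffDistanceImageFilter()
hd.Execute(cord_mask_fixed, cord_mask_warped)
print("Hausdorff distance (mm):", hd.GetHausdorffDistance())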
We showed that using a deep learning-derived synthetic CT in lieu of the MR image for MR→CT and CT→MR deformable registration yields superior results to direct multimodality registration.
© 2019 American Association of Physicists in Medicine.
