
Bidirectional feature matching based on deep pairwise contrastive learning for multiparametric MRI image synthesis.

Abstract

Multi-parametric MR image synthesis is an effective approach for clinical applications in which specific modalities needed to reach a diagnosis are unavailable. Because technical and practical constraints limit the acquisition of additional modalities for a patient, multimodal image synthesis combines the available modalities to synthesize the desired one.
Approach. In this paper, we propose a new multi-parametric MRI synthesis model that generates a target MRI modality from two other available modalities in pathological MR images. We first adopt a contrastive learning approach that trains an encoder network to extract a suitable feature representation of the target space. Second, we build a synthesis network that generates the target image from a common feature space that approximately matches the contrastively learned space of the target modality. We incorporate a bidirectional feature learning strategy that learns a multimodal feature matching function, in two opposite directions, to transform the augmented multichannel input into the learned target space. Overall, our training synthesis loss combines a reconstruction loss with a bidirectional triplet loss computed over pairs of features.
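The combined objective described above can be sketched in a minimal, framework-free form. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the L1 reconstruction term, the Euclidean feature distance, the margin, and the weighting factor `lam` are all hypothetical choices standing in for the paper's unspecified details.

```python
import math

def l1_loss(x, y):
    # Mean absolute error between two flat (image) vectors.
    return sum(abs(a - b) for a, b in zip(x, y)) / len(x)

def euclidean(x, y):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Standard margin-based triplet loss: pull the anchor toward the
    # positive and push it away from the negative by at least `margin`.
    return max(0.0, euclidean(anchor, positive)
               - euclidean(anchor, negative) + margin)

def bidirectional_synthesis_loss(synth_img, target_img,
                                 synth_feat, target_feat, negative_feat,
                                 margin=1.0, lam=0.5):
    # Reconstruction term plus the triplet term applied in two opposite
    # directions over the feature pair: once with the synthesized features
    # as anchor, once with the target features as anchor.
    recon = l1_loss(synth_img, target_img)
    trip_fwd = triplet_loss(synth_feat, target_feat, negative_feat, margin)
    trip_bwd = triplet_loss(target_feat, synth_feat, negative_feat, margin)
    return recon + lam * (trip_fwd + trip_bwd)
```

In a real pipeline the images and features would be tensors produced by the synthesis network and the contrastively trained encoder, and `negative_feat` would come from a non-matching modality or patient.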
Main results. Compared with other state-of-the-art methods, the proposed model achieved average improvement rates of $3.9\%$ and $3.6\%$ on the IXI and BraTS’18 datasets, respectively. On the tumor BraTS’18 dataset, our model records the highest Dice score of $0.793 (0.04)$ for preserving the synthesized tumor regions in the segmented images. The proposed model will be useful for MR-guided neurosurgery, where restricted MR modalities are acquired during the procedure. A clinical validation will be performed on segmented tumor areas of (sMR) during MR-guided neurosurgery. Published under a Creative Commons Attribution license.
