
Noise reduction by multiple-path neural network using attention mechanisms with an emphasis on robustness against errors: A pilot study on brain diffusion-weighted images


Abstract

In deep learning-based noise reduction, larger networks offer advanced and complex functionality by exploiting their greater degrees of freedom, but they also become less predictable, raising the risk of unforeseen errors. Here, we introduce a novel denoising model for diffusion-weighted images that intentionally limits the freedom of the network output by incorporating multiple pathways with varying degrees of freedom, with the aim of minimizing the chance of unintended alterations to the input. The purpose of this pilot study was to assess the model's ability to perform effective denoising under these constraints.

Images from 10 healthy volunteers were used. Key innovations in our model development include: (1) a neural network architecture that separates the function for calculating the specific output values from the function for adjusting that calculation at each pixel, and (2) training that optimises the network based on both the image and the secondarily obtained diffusion tensor. The generated images were compared with the original ones by measuring their deviation from ground-truth images (averaged across eight acquisitions).

The generated images demonstrated closer alignment with the ground-truth images, both visually and statistically (Q < 0.05), compared to the original images. Furthermore, the generated images also showed a significant advantage over the original images in the secondarily obtained quantitative parameter maps (Q < 0.05).

These results suggest the usefulness of the proposed method, as it improved both the quality of the generated images and the accuracy of the major diffusion parameter maps under the given restrictions.

Copyright © 2023. Published by Elsevier Ltd.
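The abstract describes a network whose output freedom is deliberately restricted: one function computes candidate output values while a separate function adjusts, per pixel, how strongly that computation is applied. One common way to realize such a constraint is a per-pixel attention gate that blends a processed pathway with the untouched input, so the output can never stray arbitrarily far from what was measured. The sketch below is only an illustration of that general idea, not the authors' actual architecture; the pathway (a box blur standing in for a learned denoiser) and the gating function are assumptions for demonstration.

```python
import numpy as np

def box_blur(img, k=3):
    """Low-freedom pathway: a simple box blur standing in for a
    constrained denoising branch (illustrative assumption)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def gated_denoise(img, attn_logits):
    """Blend the denoised pathway with the identity pathway using a
    per-pixel attention weight in (0, 1). Because the result is a
    convex combination, each output pixel is bounded by the two
    pathways, limiting unintended alterations of the input."""
    denoised = box_blur(img)
    attn = 1.0 / (1.0 + np.exp(-attn_logits))  # sigmoid gate per pixel
    return attn * denoised + (1.0 - attn) * img

# Toy usage: a noisy 8x8 image and a neutral gate (logits = 0 -> weight 0.5).
rng = np.random.default_rng(0)
noisy = rng.normal(loc=100.0, scale=10.0, size=(8, 8))
out = gated_denoise(noisy, attn_logits=np.zeros((8, 8)))
```

Training such a model on both the image loss and a loss on secondarily derived parameter maps (as the abstract reports for the diffusion tensor) would then encourage the gate to suppress the processed pathway wherever it degrades the quantitative parameters.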
