
P-ResUnet: Segmentation of brain tissue with Purified Residual Unet.


Abstract

Precise segmentation and quantification of brain tissue in Magnetic Resonance Imaging (MRI) aids the diagnosis of neurological diseases such as epilepsy, Alzheimer's disease, and multiple sclerosis. Recently, UNet-like architectures have been widely used for medical image segmentation, achieving promising performance by using skip connections to fuse low-level and high-level information. However, fusing low-level and high-level information also introduces non-object information (noise), which reduces the accuracy of medical image segmentation. The same problem exists in the residual unit: because the output and input of the residual unit are fused, the noise in the unit's input is carried into the fusion. To address this problem, we propose a Purified Residual U-net for the segmentation of brain tissue. The model encodes the image to obtain deep semantic information, purifies the information of the low-level features and of the residual unit, and finally produces the result through a decoder. We use the Dilated Pyramid Separate Block (DPSB) as the first block of every encoder layer except the first to purify the features; it expands the receptive field of the convolution kernel while adding only a few parameters. In the first layer, we found the best performance is achieved with the DPB: the initial image contains the most non-object information (noise), so exchanging information to the maximum degree there benefits accuracy. We conducted experiments on the widely used IBSR-18 dataset, which is composed of T1-weighted MRI volumes from 18 subjects. The results show that, compared with several cutting-edge methods, our method improves segmentation performance, with the mean Dice score reaching 91.093% and the mean Hausdorff distance decreasing to 3.2606.
