Analog Spiking U-Net integrating CBAM&ViT for medical image segmentation.

Abstract

Spiking neural networks (SNNs) are gaining popularity in AI research as a low-power alternative to conventional deep learning, owing to their sparse activity and biological interpretability. Applying SNNs to dense prediction tasks is becoming an important research direction. In this paper, we first propose a novel modification of the conventional Spiking U-Net architecture that adjusts the firing positions of the neurons. The modified network, named Analog Spiking U-Net (AS U-Net), makes it possible to incorporate the Convolutional Block Attention Module (CBAM) into the SNN domain. This is the first successful implementation of CBAM in SNNs, with the potential to improve the SNN model's segmentation performance while reducing information loss. The proposed AS U-Net (with CBAM&ViT) is then trained via direct encoding on a comprehensive dataset obtained by merging several diabetic retinal vessel segmentation datasets. Experimental results show that the proposed SNN model achieves the highest segmentation accuracy on diabetic retinal vessel segmentation, surpassing other SNN-based models and most related ANN-based models. In addition, with the same architecture, our model achieves performance comparable to its ANN counterpart. The model also attains state-of-the-art (SOTA) results in comparative experiments when both accuracy and energy consumption are considered (Fig. 1). An ablation study of CBAM further confirms its feasibility and effectiveness in SNNs, suggesting a practical route toward subsequent deployment and hardware chip applications. Finally, we conduct extensive generalization experiments on segmentation tasks of the same type (ISBI and ISIC), a more complex multi-class segmentation task (Synapse), and a series of image generation tasks (MNIST, Day2night, Maps, Facades) to demonstrate the generality of the proposed method.
Copyright © 2024 Elsevier Ltd. All rights reserved.
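The core idea described in the abstract is to compute attention on analog (pre-spike) feature values rather than on binary spike trains, so that CBAM's multiplicative reweighting is not destroyed by the firing nonlinearity. Below is a minimal PyTorch sketch of one such block. The module names (ChannelAttention, SpatialAttention, LIFNeuron, AnalogSpikingConvBlock), the LIF parameters, and the exact placement of CBAM between the convolution and the firing step are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: CBAM applied to analog (pre-spike) activations inside a
# spiking conv block, using a simple LIF neuron. Not the authors' code.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        hidden = max(channels // reduction, 1)
        self.mlp = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
        )

    def forward(self, x):
        # x: (B, C, H, W) analog feature map
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        scale = torch.sigmoid(avg + mx).unsqueeze(-1).unsqueeze(-1)
        return x * scale


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class LIFNeuron(nn.Module):
    """Minimal leaky integrate-and-fire neuron (surrogate gradient omitted).

    Call reset() between input sequences; membrane state persists across steps.
    """
    def __init__(self, threshold=1.0, decay=0.5):
        super().__init__()
        self.threshold, self.decay = threshold, decay
        self.mem = None

    def reset(self):
        self.mem = None

    def forward(self, x):
        if self.mem is None or self.mem.shape != x.shape:
            self.mem = torch.zeros_like(x)
        self.mem = self.decay * self.mem + x
        spikes = (self.mem >= self.threshold).float()
        self.mem = self.mem - spikes * self.threshold  # soft reset
        return spikes


class AnalogSpikingConvBlock(nn.Module):
    """Conv -> BN -> CBAM on analog values -> LIF spiking output.

    Attention is computed before the firing step, so CBAM reweights
    real-valued features; only the block's output is binary spikes.
    """
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.ca = ChannelAttention(out_ch)
        self.sa = SpatialAttention()
        self.lif = LIFNeuron()

    def forward(self, x):
        u = self.bn(self.conv(x))   # analog feature map
        u = self.sa(self.ca(u))     # CBAM on analog values
        return self.lif(u)          # spikes propagate to the next block
```

Under direct encoding, the same analog image would be presented to the first block for T time steps as the input current, with the output spikes accumulated over time to form the prediction; the time-step count, threshold, and decay above are placeholder values, not those reported in the paper.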
