
Attention Feature Fusion Network via Knowledge Propagation for Automated Respiratory Sound Classification.


Abstract

Goal: In light of the COVID-19 pandemic, the early diagnosis of respiratory diseases has become increasingly crucial. Traditional diagnostic methods such as computed tomography (CT) and magnetic resonance imaging (MRI), while accurate, often face accessibility challenges. Lung auscultation, a simpler alternative, is subjective and highly dependent on the clinician's expertise. The pandemic has further exacerbated these challenges by restricting face-to-face consultations. This study aims to overcome these limitations by developing an automated respiratory sound classification system using deep learning, facilitating remote and accurate diagnoses.

Methods: We developed a deep convolutional neural network (CNN) model that utilizes spectrographic representations of respiratory sounds within an image classification framework. Our model is enhanced with attention feature fusion of low-to-high-level information based on a knowledge propagation mechanism to increase classification effectiveness. This novel approach was evaluated using the ICBHI benchmark dataset and a larger, self-collected Pediatric dataset comprising outpatient children aged 1 to 6 years.

Results: The proposed CNN model with knowledge propagation demonstrated superior performance compared to existing state-of-the-art models. Specifically, our model showed higher sensitivity in detecting abnormalities in the Pediatric dataset, indicating its potential for improving the accuracy of respiratory disease diagnosis.

Conclusions: The integration of a knowledge propagation mechanism into a CNN model marks a significant advancement in the field of automated diagnosis of respiratory disease. This study paves the way for more accessible and precise healthcare solutions, which is especially crucial in pandemic scenarios.

© 2024 The Authors.
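The abstract describes attention feature fusion of low-to-high-level features over spectrogram inputs but does not give implementation details. The PyTorch sketch below is purely illustrative and is not the authors' module: it shows one common way to fuse a low-level and a high-level feature map with a learned channel-attention gate. The class name AttentionFeatureFusion, the reduction ratio, and the gating scheme are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionFeatureFusion(nn.Module):
    """Hypothetical sketch: fuse a low-level and a high-level feature map
    with a learned channel-attention gate (not the paper's exact design)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        hidden = max(channels // reduction, 1)
        # Channel attention computed from the summed feature maps.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),        # global context per channel
            nn.Conv2d(channels, hidden, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        # Upsample the coarser high-level map to the low-level resolution.
        high = F.interpolate(high, size=low.shape[-2:], mode="bilinear",
                             align_corners=False)
        w = self.gate(low + high)           # attention weights in [0, 1]
        return w * low + (1.0 - w) * high   # weighted fusion of both levels


if __name__ == "__main__":
    # Toy spectrogram feature maps: batch=2, 64 channels, 32x32 and 16x16 grids.
    low = torch.randn(2, 64, 32, 32)
    high = torch.randn(2, 64, 16, 16)
    fused = AttentionFeatureFusion(64)(low, high)
    print(fused.shape)  # torch.Size([2, 64, 32, 32])
```

The gating here weights each channel of the two feature maps against each other; how the paper's knowledge propagation mechanism combines levels may differ.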
