RVCNet: A hybrid deep neural network framework for the diagnosis of lung diseases.

Researchers

Alam et al.

Journal

Modalities

Chest X-ray

Models

ResNet101V2, VGG19, CNN (combined as RVCNet)

Abstract

Early evaluation and diagnosis can significantly reduce the life-threatening nature of lung diseases. Computer-aided diagnostic (CAD) systems can help radiologists make more precise diagnoses and reduce misinterpretations in lung disease diagnosis. Existing literature indicates that more research is needed to correctly classify lung diseases in the presence of multiple classes across different radiographic imaging datasets. This paper therefore proposes RVCNet, a hybrid deep neural network framework for predicting lung diseases from a multi-class X-ray dataset. The framework combines three deep learning techniques: ResNet101V2, VGG19, and a basic CNN model. Hyperparameter fine-tuning is applied in the feature extraction phase of the hybrid architecture, and additional layers, such as batch normalization, dropout, and a few dense layers, are applied in the classification phase. The proposed method is evaluated on a dataset of X-ray images from COVID-19, non-COVID lung infection, viral pneumonia, and normal patients, using 2262 training and 252 testing images. With the Nadam optimizer, the proposed algorithm achieves an overall classification accuracy, AUC, precision, recall, and F1-score of 91.27%, 92.31%, 90.48%, 98.30%, and 94.23%, respectively. These results are compared with recent deep learning models: on this four-class dataset, RVCNet's classification accuracy of 91.27% exceeds that of ResNet101V2, VGG19, VGG19 over CNN, and other stand-alone models. Finally, the Grad-CAM approach is applied to interpret how the RVCNet framework classifies images.

Copyright: © 2023 Alam et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
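The abstract names the building blocks but not the exact wiring, so the sketch below is a minimal, hypothetical Keras reconstruction under stated assumptions: ResNet101V2 and VGG19 as frozen ImageNet feature extractors fused with a small CNN branch, followed by the batch normalization, dropout, and dense layers the abstract mentions, compiled with Nadam. The input size, layer widths, dropout rate, and learning rate are illustrative choices, not values from the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

NUM_CLASSES = 4               # COVID-19, non-COVID infection, viral pneumonia, normal
INPUT_SHAPE = (224, 224, 3)   # assumed input size; not stated in the abstract

inputs = layers.Input(shape=INPUT_SHAPE)

# Two pretrained backbones, frozen and used as feature extractors.
resnet = tf.keras.applications.ResNet101V2(
    include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)
vgg = tf.keras.applications.VGG19(
    include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)
resnet.trainable = False
vgg.trainable = False

# Each backbone expects its own input preprocessing.
r = resnet(tf.keras.applications.resnet_v2.preprocess_input(inputs))
v = vgg(tf.keras.applications.vgg19.preprocess_input(inputs))

# Small CNN branch trained from scratch (widths are illustrative).
c = layers.Conv2D(32, 3, activation="relu")(inputs)
c = layers.MaxPooling2D()(c)
c = layers.Conv2D(64, 3, activation="relu")(c)
c = layers.MaxPooling2D()(c)

# Fuse the three branches at the feature level.
x = layers.Concatenate()([
    layers.GlobalAveragePooling2D()(r),
    layers.GlobalAveragePooling2D()(v),
    layers.GlobalAveragePooling2D()(c),
])

# Classification head: batch normalization, dense layers, dropout.
x = layers.BatchNormalization()(x)
x = layers.Dense(256, activation="relu")(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(
    optimizer=optimizers.Nadam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy", tf.keras.metrics.AUC(name="auc")],
)
```

Fusing pooled features from the three branches is one plausible reading of the hybrid design; the published architecture may differ in depth, fusion point, and fine-tuning schedule.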

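The abstract also notes that Grad-CAM is used to interpret the predictions. Below is a minimal, generic Grad-CAM sketch in the same Keras setting, not the paper's implementation: `grad_cam` and its signature are illustrative, and `conv_layer_name` must point at the last convolutional layer of interest (with nested backbones like those above, you would build the gradient model against the relevant sub-model's layer).

```python
import tensorflow as tf

def grad_cam(model, image, conv_layer_name, class_index=None):
    """Return a [0, 1] heatmap of where `model` looks when scoring `class_index`.

    `image` is a single preprocessed array of shape (H, W, 3).
    """
    # Model mapping the input to both the chosen conv activations and the output.
    grad_model = tf.keras.models.Model(
        model.inputs,
        [model.get_layer(conv_layer_name).output, model.output],
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[None, ...])
        if class_index is None:
            class_index = int(tf.argmax(preds[0]))
        score = preds[:, class_index]
    grads = tape.gradient(score, conv_out)           # d(class score) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(1, 2))     # global-average-pool the gradients
    cam = tf.reduce_sum(conv_out * weights[:, None, None, :], axis=-1)[0]
    cam = tf.nn.relu(cam)                            # keep only positive evidence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()
```

Upsampled to the input resolution and overlaid on the X-ray, such a heatmap is what lets the paper argue that RVCNet's predictions attend to clinically relevant lung regions.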