
GCDN-Net: Garbage classifier deep neural network for recyclable urban waste management.

Abstract

The escalating volume of waste driven by urbanization and population growth has underscored the need for advanced sorting and recycling methods to ensure sustainable waste management. Deep learning models, adept at image recognition tasks, offer potential solutions for waste sorting applications. Trained on extensive waste image datasets, these models can discern the distinguishing features of diverse waste types. Automating waste sorting hinges on robust deep learning models capable of accurately categorizing a wide range of waste types. In this study, a multi-stage machine learning approach is proposed to classify different waste categories using the “Garbage In, Garbage Out” (GIGO) dataset of 25,000 images. The novel Garbage Classifier Deep Neural Network (GCDN-Net) is introduced as a comprehensive solution, adept at both single-label and multi-label classification tasks. Single-label classification distinguishes between garbage and non-garbage images, while multi-label classification identifies the distinct garbage categories present in one or more images. The performance of GCDN-Net is rigorously evaluated and compared against state-of-the-art waste classification methods. GCDN-Net achieves 95.77% accuracy, 95.78% precision, 95.77% recall, 95.77% F1-score, and 95.54% specificity when classifying waste images, outperforming existing models in single-label classification. In multi-label classification, GCDN-Net attains an overall mean average precision (mAP) of 0.69 and an F1-score of 75.01%. The reliability of the network’s performance is affirmed through saliency-map visualizations generated by Score-CAM (class activation mapping). In conclusion, deep learning-based models are effective at categorizing diverse waste types, paving the way for automated waste sorting and recycling systems that can reduce costs and processing times.

Copyright © 2023 Elsevier Ltd. All rights reserved.
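
The abstract reports multi-label performance as mean average precision (mAP) and F1-score. As a rough illustration only, and not the authors' code, the sketch below shows how these two metrics are commonly computed for multi-label predictions with scikit-learn; the four garbage categories, ground-truth labels, and predicted probabilities are made-up placeholders.

```python
# Minimal sketch: multi-label evaluation with mAP and F1, assuming
# sigmoid-style per-class probabilities. All data here is hypothetical.
import numpy as np
from sklearn.metrics import average_precision_score, f1_score

# Hypothetical ground truth and predicted probabilities for 4 categories
# over 6 images (each image may contain several categories at once).
y_true = np.array([
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
])
y_prob = np.array([
    [0.9, 0.2, 0.8, 0.1],
    [0.1, 0.7, 0.3, 0.2],
    [0.8, 0.6, 0.2, 0.9],
    [0.2, 0.1, 0.7, 0.3],
    [0.7, 0.3, 0.1, 0.8],
    [0.3, 0.8, 0.6, 0.2],
])

# Mean average precision: average of the per-class average precision scores.
mAP = average_precision_score(y_true, y_prob, average="macro")

# F1-score after thresholding the per-class probabilities at 0.5.
y_pred = (y_prob >= 0.5).astype(int)
f1 = f1_score(y_true, y_pred, average="micro")

print(f"mAP: {mAP:.2f}, F1: {f1:.2%}")
```

The averaging choices (macro for mAP, micro for F1) and the 0.5 threshold are common defaults rather than details taken from the paper.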
