
Lightweight deep learning model for underwater waste segmentation based on sonar images.

Abstract

In recent years, the rapid accumulation of marine waste has not only endangered the ecological environment but also polluted seawater. Traditional manual salvage methods are inefficient and pose safety risks to human operators, making automatic underwater waste recycling a mainstream approach. In this paper, we propose a lightweight multi-scale cross-level network for underwater waste segmentation based on sonar images, which provides pixel-level location information and waste categories for autonomous underwater robots. In particular, we introduce hybrid perception and multi-scale attention modules to capture multi-scale contextual features and enhance high-level critical information, respectively. At the same time, we use sampling attention modules to achieve feature down-sampling and cross-level interaction modules to fuse detailed and semantic features. Experimental results indicate that our method outperforms other semantic segmentation models, achieving 74.66% mIoU with only 0.68 M parameters. Compared with the representative PIDNet Small model based on the convolutional neural network architecture, our method improves mIoU by 1.15 percentage points while reducing model parameters by approximately 91%; compared with the representative SeaFormer T model based on the transformer architecture, it improves mIoU by 2.07 percentage points while reducing parameters by approximately 59%. Our approach thus maintains a satisfactory balance between model parameters and segmentation performance, and provides new insights into intelligent underwater waste recycling, which helps in promoting sustainable marine development.

Copyright © 2024 Elsevier Ltd. All rights reserved.
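The headline metric above, mean intersection-over-union (mIoU), averages the per-class IoU (true positives over the union of prediction and ground truth) across waste categories. A minimal sketch of that computation on flattened per-pixel class labels is below; the helper name `miou` and the toy inputs are illustrative, not from the paper:

```python
def miou(preds, labels, num_classes):
    """Mean intersection-over-union over all classes present in either map.

    preds, labels: flat sequences of per-pixel class indices.
    """
    ious = []
    for c in range(num_classes):
        # Pixels where both prediction and ground truth are class c.
        inter = sum(p == c and t == c for p, t in zip(preds, labels))
        # Pixels where either prediction or ground truth is class c.
        union = sum(p == c or t == c for p, t in zip(preds, labels))
        if union:  # skip classes absent from both maps
            ious.append(inter / union)
    return sum(ious) / len(ious)

# Toy 4-pixel example with 2 classes:
# class 0: intersection 1, union 2 -> IoU 0.5
# class 1: intersection 2, union 3 -> IoU 2/3
score = miou([0, 0, 1, 1], [0, 1, 1, 1], num_classes=2)  # (0.5 + 2/3) / 2
```

In practice the same reduction is run over full segmentation maps (often via a confusion matrix for efficiency), which is how figures such as 74.66% mIoU are obtained.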
