COVID-19 risk reduce based YOLOv4-P6-FaceMask detector and DeepSORT tracker.

Abstract

Wearing masks in public areas is one of the effective protection methods for people. Although it is essential to wear the facemask correctly, there are few research studies on facemask detection and tracking based on image processing. In this work, we propose a new high-performance two-stage facemask detector and tracker that uses a monocular camera and a deep learning framework to automate facemask detection and tracking in video sequences. Furthermore, we propose a novel facemask detection dataset consisting of 18,000 images with more than 30,000 tight bounding boxes annotated with three class labels: masked, incorrectly masked, and not masked. We build on the Scaled-You Only Look Once (Scaled-YOLOv4) object detection model to train the YOLOv4-P6-FaceMask detector and use Simple Online and Real-time Tracking with a deep association metric (DeepSORT) to track faces. DeepSORT assigns an ID to each tracked face so that every face is saved only once, which allows us to build a database of unmasked faces. YOLOv4-P6-FaceMask achieves 93% mean average precision, 92% mean average recall, and a real-time speed of 35 fps on a single Tesla T4 GPU on our proposed dataset. To demonstrate the performance of the proposed model, we compare its detection and tracking results with other popular state-of-the-art facemask detection and tracking models.
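
The sketch below illustrates the detect-then-track pipeline the abstract describes: a per-frame facemask detector feeds a DeepSORT-style tracker, and each track ID is used to save an unmasked face only once. The functions detect_facemasks and DeepSortTracker are hypothetical placeholders for the YOLOv4-P6-FaceMask detector and the DeepSORT tracker (the paper does not publish this interface), and the class index for "not masked" is assumed; only the OpenCV video I/O calls are real library API.

# Minimal sketch of the detect-then-track pipeline, under the assumptions above.
import cv2

NO_MASK = 2  # assumed class index for the "not masked" label

def detect_facemasks(frame):
    """Hypothetical stand-in for the YOLOv4-P6-FaceMask detector.
    Expected to return a list of ((x, y, w, h), confidence, class_id) tuples."""
    raise NotImplementedError

class DeepSortTracker:
    """Hypothetical stand-in for DeepSORT with a deep association metric."""
    def update(self, detections, frame):
        """Expected to return tracks as (track_id, (x, y, w, h), class_id) tuples."""
        raise NotImplementedError

def build_unmasked_face_database(video_path, out_dir="unmasked_faces"):
    tracker = DeepSortTracker()
    saved_ids = set()                    # track IDs already written to disk
    cap = cv2.VideoCapture(video_path)   # monocular camera or video file
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        detections = detect_facemasks(frame)
        for track_id, (x, y, w, h), class_id in tracker.update(detections, frame):
            # Save each unmasked face only once, keyed by its DeepSORT track ID.
            if class_id == NO_MASK and track_id not in saved_ids:
                crop = frame[y:y + h, x:x + w]
                cv2.imwrite(f"{out_dir}/face_{track_id}.jpg", crop)
                saved_ids.add(track_id)
    cap.release()

Keying the database on track IDs rather than raw detections is what prevents the same person from being stored once per frame; the tracker's appearance-based association keeps the ID stable across occlusions and missed detections.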
