
68 landmarks are efficient for 3D face alignment: what about more? 3D face alignment method applied to face recognition


Abstract

This paper proposes a 3D face alignment method for 2D face images in the wild with noisy landmarks. The objective is to recognize individuals from a single profile image. We first extract more than 68 landmarks using a bag of features, which yields a bag of visible and invisible facial keypoints. We then reconstruct a 3D face model and obtain a triangular mesh by meshing these keypoints. Because the number of keypoints differs from face to face, this step is particularly challenging. Next, we process the 3D face with the butterfly subdivision and ball-pivoting (BPA) algorithms to enforce correlation and regularity between 3D face regions. Indeed, these 2D-to-3D annotations give the reconstructed 3D face model much higher quality without the need for any additional 3D morphable model. Finally, we carry out alignment and pose-correction steps to obtain a frontal pose, fitting the rendered 3D reconstruction to the 2D face and performing pose normalization, so as to achieve good face recognition rates. The recognition step is based on deep learning: deep convolutional neural networks (DCNNs) are used for feature learning and face identification. The proposed method is evaluated on three popular benchmarks: the YTF, LFW, and BIWI databases. Compared to the best recognition results reported on these benchmarks, it achieves comparable or even better performance.

© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023.
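The butterfly step mentioned in the abstract is an interpolating subdivision scheme that inserts a new vertex on each mesh edge from an eight-point stencil. The abstract gives no implementation details, so the following is only a minimal illustrative sketch (the function name and stencil layout are my own, not the authors') of the standard regular-edge butterfly rule, with weights 1/2 for the two edge endpoints, 1/8 for the two vertices opposite the edge, and -1/16 for the four "wing" vertices:

```python
def butterfly_edge_point(a, b, c, d, wings):
    """New mid-edge vertex for a regular butterfly subdivision stencil.

    a, b  -- endpoints of the edge being split
    c, d  -- vertices opposite the edge in the two triangles sharing it
    wings -- the four wing vertices of the stencil

    Classic weights: 1/2 (endpoints), 1/8 (opposite), -1/16 (wings).
    """
    dim = len(a)
    point = [0.0] * dim
    for i in range(dim):
        point[i] = 0.5 * (a[i] + b[i]) + 0.125 * (c[i] + d[i])
        point[i] -= 0.0625 * sum(w[i] for w in wings)
    return point

# For a planar, symmetric stencil the rule reproduces the edge midpoint,
# a quick sanity check on the weights:
p = butterfly_edge_point(
    (0, 0, 0), (2, 0, 0),          # edge endpoints
    (1, 1, 0), (1, -1, 0),         # opposite vertices
    [(-1, 1, 0), (-1, -1, 0), (3, 1, 0), (3, -1, 0)],  # wings
)
# p is [1.0, 0.0, 0.0], the midpoint of the edge
```

Near irregular vertices and mesh boundaries the butterfly scheme requires modified stencils, which this sketch omits; a full implementation would also need the edge-triangle adjacency of the mesh to gather each stencil.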
