Automated 3D Perioral Landmark Detection Using High-Resolution Network: Artificial Intelligence-Based Anthropometric Analysis.
Abstract
Three-dimensional (3D) facial stereophotogrammetry is a convenient, non-invasive, and highly reliable evaluation tool that has shown great potential in preoperative planning and treatment-efficacy evaluation in plastic surgery in recent years. However, it requires manual identification of facial landmarks by trained evaluators to obtain anthropometric data, which consumes a large amount of time and effort. Automatic 3D facial landmark localization could enable fast data acquisition and eliminate evaluator error.

In this paper, we propose a novel deep-learning method based on dimension transformation and key-point detection for automated 3D perioral landmark annotation. The 3D facial model is transformed into 2D images, on which a High-Resolution Network (HRNet) is applied for key-point detection. The 2D coordinates of the key points are then mapped back onto the 3D model using mathematical methods to obtain the 3D landmark coordinates. The program was trained on 120 facial models and validated on 50 facial models.

Our approach achieved a satisfactory landmark-detection error of 1.30 ± 0.68 mm, with an average processing time of 5.2 ± 0.21 seconds per model. Subsequent analysis based on these landmarks showed an error of 0.87 ± 1.02 mm for linear measurements and 5.62 ± 6.61° for angular measurements.

This automated 3D perioral landmarking method could serve as an effective tool for fast and accurate anthropometric analysis of lip morphology in plastic surgery and aesthetic procedures.

© The Author(s) 2024. Published by Oxford University Press on behalf of The Aesthetic Society. All rights reserved.
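The abstract does not spell out how the dimension transformation or the 2D-to-3D back-mapping is carried out, so the following is only a minimal sketch of the general idea under simplifying assumptions: project the mesh vertices onto a frontal 2D image plane, detect key points on that image, then recover 3D landmarks by inverting the projection and snapping to the nearest mesh vertex. The function names (orthographic_project, map_keypoints_to_3d) and the orthographic-camera assumption are illustrative, not the authors' implementation.

```python
import numpy as np

def orthographic_project(vertices, image_size=512):
    """Project 3D mesh vertices onto a 2D image plane (frontal view, along z).

    Returns pixel coordinates for every vertex plus the offset/scale needed to
    invert the mapping later. Assumes a simple orthographic frontal view as a
    stand-in for the paper's dimension-transformation step.
    """
    xy = vertices[:, :2]
    mins, maxs = xy.min(axis=0), xy.max(axis=0)
    scale = (image_size - 1) / (maxs - mins).max()
    pixels = (xy - mins) * scale
    return pixels, mins, scale

def map_keypoints_to_3d(keypoints_2d, vertices, mins, scale):
    """Map 2D key points detected on the rendered image back onto the mesh.

    Each 2D key point is unprojected to the mesh's (x, y) plane and assigned
    the full 3D coordinate of the nearest vertex in that plane. A depth-buffer
    or ray-casting lookup would be more faithful; nearest-vertex search is
    used here only to illustrate the idea.
    """
    landmarks_3d = []
    for kp in keypoints_2d:
        xy = kp / scale + mins                       # invert the 2D mapping
        d = np.linalg.norm(vertices[:, :2] - xy, axis=1)
        landmarks_3d.append(vertices[np.argmin(d)])  # nearest vertex in-plane
    return np.asarray(landmarks_3d)

# Toy usage with a random point cloud standing in for a facial mesh
verts = np.random.rand(1000, 3) * 100.0
pix, mins, scale = orthographic_project(verts)
detected = pix[[10, 20, 30]]                         # pretend the detector found these
print(map_keypoints_to_3d(detected, verts, mins, scale))
```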
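For the key-point detection stage, HRNet-style detectors typically output one heatmap per landmark, and 2D coordinates are commonly recovered by taking the argmax of each map and rescaling to the input-image resolution. The sketch below shows that standard decoding step; the landmark count and image sizes are arbitrary placeholders, not values reported in the paper.

```python
import numpy as np

def decode_heatmaps(heatmaps, image_size):
    """Convert per-landmark heatmaps of shape (K, H, W) into 2D pixel coordinates.

    Takes the argmax of each heatmap and rescales from heatmap resolution to
    the input-image resolution, the usual decoding for HRNet-style detectors.
    """
    k, h, w = heatmaps.shape
    flat_idx = heatmaps.reshape(k, -1).argmax(axis=1)
    rows, cols = np.unravel_index(flat_idx, (h, w))
    scale_x, scale_y = image_size[0] / w, image_size[1] / h
    return np.stack([cols * scale_x, rows * scale_y], axis=1)  # (K, 2) as (x, y)

# Example: 13 hypothetical perioral landmarks, 64x64 heatmaps, 256x256 input image
hm = np.random.rand(13, 64, 64)
print(decode_heatmaps(hm, (256, 256)).shape)  # (13, 2)
```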
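The reported linear and angular errors compare measurements derived from automated landmarks against the same measurements derived from manual annotations. Below is a hedged NumPy sketch of how such errors can be computed; the landmark pairs and triplets that define the actual perioral measurements are not given in the abstract and would need to be supplied.

```python
import numpy as np

def linear_error(auto_lm, manual_lm, pair_indices):
    """Absolute differences between automated and manual inter-landmark distances (mm)."""
    errs = []
    for i, j in pair_indices:
        d_auto = np.linalg.norm(auto_lm[i] - auto_lm[j])
        d_manual = np.linalg.norm(manual_lm[i] - manual_lm[j])
        errs.append(abs(d_auto - d_manual))
    return np.asarray(errs)

def angle(a, b, c):
    """Angle in degrees at vertex b formed by points a-b-c."""
    v1, v2 = a - b, c - b
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def angular_error(auto_lm, manual_lm, triplets):
    """Absolute differences between automated and manual angles (degrees)."""
    return np.asarray([abs(angle(*auto_lm[list(t)]) - angle(*manual_lm[list(t)]))
                       for t in triplets])

# Toy usage: four landmarks, manual annotations perturbed by noise
auto = np.array([[0., 0, 0], [50, 0, 0], [25, 20, 5], [25, -20, 5]])
manual = auto + np.random.normal(scale=0.5, size=auto.shape)
print(linear_error(auto, manual, [(0, 1), (2, 3)]).mean())
print(angular_error(auto, manual, [(0, 2, 1)]).mean())
```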