IMoVR-Net: A robust interpretable network for multi-ocular lesion recognition from TAO facial images.

Abstract

Thyroid-associated ophthalmopathy (TAO) is an organ-specific autoimmune disease that seriously affects patients' lives and health. However, early diagnosis of TAO depends heavily on the physician's subjective experience. Moreover, existing deep learning networks for eye diseases do not provide robust interpretability in their feature learning paradigm, model structure, or number of neurons, even though these components are central to model interpretability and are key factors governing model transparency. This paper therefore proposes a robust interpretable multi-orientation visual recognition network (IMoVR-Net) for recognizing multiple TAO ocular lesions. First, a multi-orientation visual cascaded encoder, composed of a DenseGabor module and a dilated Gabor convolution group, is proposed to finely extract multi-directional TAO lesion features through a novel feature learning paradigm called alternating filtering. Second, combining information theory with topological tools, an optimization rule based on topological energy entropy is proposed, giving a solid interpretable basis for determining the model structure. Finally, a clustering correlation analysis method is developed to determine the number of convolutional hidden neurons, providing robust interpretability for that choice. Compared with other advanced models, IMoVR-Net achieved state-of-the-art performance on different TAO ocular datasets, with an average accuracy of 87.8%, sensitivity of 87.27%, precision of 87.5%, and F1 score of 87.39%. Its strong recognition ability and robust interpretability across the feature extraction paradigm, model structure, and number of convolutional neurons give IMoVR-Net good prospects for clinical application.
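The abstract does not spell out how the multi-orientation encoder is built. As a minimal sketch, assuming a PyTorch setting, one plausible reading of "alternating filtering" is a fixed multi-orientation Gabor filter bank alternated with a learned channel-mixing convolution; the names `gabor_kernel` and `GaborConvBlock` below are hypothetical, not the paper's DenseGabor module.

```python
import math
import torch
import torch.nn as nn

def gabor_kernel(size, theta, sigma=2.0, lam=4.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel of shape (size, size) at orientation theta."""
    half = size // 2
    ys, xs = torch.meshgrid(
        torch.arange(-half, half + 1, dtype=torch.float32),
        torch.arange(-half, half + 1, dtype=torch.float32),
        indexing="ij",
    )
    x_r = xs * math.cos(theta) + ys * math.sin(theta)
    y_r = -xs * math.sin(theta) + ys * math.cos(theta)
    return torch.exp(-(x_r**2 + gamma**2 * y_r**2) / (2 * sigma**2)) \
        * torch.cos(2 * math.pi * x_r / lam + psi)

class GaborConvBlock(nn.Module):
    """Fixed orientation-selective filtering alternated with learned mixing.

    A hypothetical stand-in for one stage of the multi-orientation encoder:
    every input channel is filtered by a bank of Gabor kernels at several
    orientations (fixed weights), then a learned 1x1 convolution mixes the
    oriented responses back to the desired width.
    """
    def __init__(self, in_ch, out_ch, n_orientations=8, size=7):
        super().__init__()
        thetas = [k * math.pi / n_orientations for k in range(n_orientations)]
        bank = torch.stack([gabor_kernel(size, t) for t in thetas])  # (O, k, k)
        # Tile the bank so each input channel sees all orientations.
        weight = bank.unsqueeze(1).repeat(in_ch, 1, 1, 1)            # (O*in_ch, 1, k, k)
        self.register_buffer("gabor_weight", weight)
        self.in_ch = in_ch
        self.pad = size // 2
        self.mix = nn.Conv2d(n_orientations * in_ch, out_ch, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # groups=in_ch: each input channel is convolved with the full bank.
        oriented = nn.functional.conv2d(
            x, self.gabor_weight, padding=self.pad, groups=self.in_ch
        )
        return self.act(self.mix(oriented))

# Usage: a 3-channel facial image mapped to 32 oriented feature maps.
x = torch.randn(1, 3, 224, 224)
y = GaborConvBlock(3, 32)(x)   # -> (1, 32, 224, 224)
```

Keeping the Gabor weights as a registered buffer rather than a parameter is what makes the filtering stage fixed and orientation-interpretable while only the mixing stage is learned.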
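Similarly, the abstract only names the clustering correlation analysis used to pick the number of convolutional neurons. A generic stand-in, under the assumption that redundant filters form correlated clusters, is to cluster the flattened filters on their correlation structure and pick the cluster count with the best silhouette score; `estimate_neuron_count` is a hypothetical helper, not the paper's criterion.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def estimate_neuron_count(filters, k_range=range(2, 33)):
    """Pick a convolutional width by clustering flattened filters.

    filters: array of shape (n_filters, n_weights). Rows of the pairwise
    correlation matrix serve as features, so strongly correlated (redundant)
    filters fall into the same cluster; the best-scoring cluster count is
    taken as the effective number of neurons.
    """
    corr = np.corrcoef(filters)                     # (n_filters, n_filters)
    best_k, best_score = None, -1.0
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(corr)
        score = silhouette_score(corr, labels)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Usage: 64 candidate 7x7 single-channel filters, flattened to 49 weights each.
rng = np.random.default_rng(0)
print(estimate_neuron_count(rng.standard_normal((64, 49))))
```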
