
Convolutional Neural Network for predicting thyroid cancer based on ultrasound elastography image of perinodular region.

Abstract

We aimed to develop deep learning models based on SWE images of perinodular regions and US images of thyroid nodules (TNs) and determine their performance in predicting thyroid cancer. A total of 1747 ACR-TIRADS 4 (TR4) TNs in 1582 patients were included in this retrospective study. US images, SWE images, and two quantitative SWE parameters (maximum elasticity of TNs; 5-point average maximum elasticity of TNs) were obtained. Based on US and SWE images of TNs and perinodular tissue, respectively, seven single-image CNN models (US, internal SWE, 0.5 mm SWE, 1.0 mm SWE, 1.5 mm SWE, and 2.0 mm SWE of perinodular tissue, and whole SWE ROI image) and six fusion-image CNN models (US + internal SWE, US + 0.5 mm SWE, US + 1.0 mm SWE, US + 1.5 mm SWE, US + 2.0 mm SWE, US + ROI SWE) were established using ResNet18. All of the CNN models and quantitative SWE parameters were built on a training cohort (1247 TNs) and evaluated on a validation cohort (500 TNs). In predicting thyroid cancer, the US + 2.0 mm SWE image CNN model obtained the highest areas under the curve (AUCs) for 10 mm < TNs ≤ 20 mm (0.95 for training; 0.92 for validation) and TNs > 20 mm (0.95 for training; 0.92 for validation), while the US + 1.0 mm SWE image CNN model obtained the highest AUC for TNs ≤ 10 mm (0.95 for training; 0.92 for validation). CNN models based on the fusion of SWE segmentation images and US images improve the radiological diagnostic accuracy of thyroid cancer.

© The Author(s) 2022. Published by Oxford University Press on behalf of Endocrine Society. All rights reserved. For permissions, please e-mail: [email protected].
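The perinodular SWE inputs (0.5 mm through 2.0 mm) imply extracting a fixed-width ring of tissue immediately surrounding the segmented nodule. A minimal sketch of that ring extraction, assuming a binary nodule mask and isotropic pixel spacing (the `perinodular_ring` helper and the brute-force distance check are illustrative assumptions, not the paper's actual segmentation pipeline):

```python
import numpy as np

def perinodular_ring(mask, width_mm, mm_per_px):
    """Binary ring of pixels within `width_mm` of the nodule mask,
    excluding the nodule itself (brute-force nearest-pixel distance;
    illustrative only, not the authors' implementation)."""
    nodule = np.argwhere(mask)            # (N, 2) nodule pixel coordinates
    max_px = width_mm / mm_per_px         # ring width in pixels
    ring = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue                  # keep the nodule out of the ring
            d = np.sqrt(((nodule - (y, x)) ** 2).sum(axis=1)).min()
            if d <= max_px:
                ring[y, x] = True
    return ring

# Toy example: 9x9 image, 3x3 nodule in the centre, 0.5 mm pixels,
# so a 1.0 mm ring is two pixels wide.
mask = np.zeros((9, 9), dtype=bool)
mask[3:6, 3:6] = True
ring = perinodular_ring(mask, width_mm=1.0, mm_per_px=0.5)
```

The ring mask would then be applied to the SWE image (e.g. `swe * ring`) to produce the 0.5/1.0/1.5/2.0 mm perinodular crops fed to the CNN; in practice a distance transform or morphological dilation would replace the brute-force loop.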
