
Diagnostic performance of deep learning-based automatic white matter hyperintensity segmentation for classification of the Fazekas scale and differentiation of subcortical vascular dementia.


Abstract

To validate the diagnostic performance of a commercially available, deep learning-based automatic white matter hyperintensity (WMH) segmentation algorithm for classifying the grades of the Fazekas scale and differentiating subcortical vascular dementia.

This retrospective, observational, single-institution study investigated the diagnostic performance of deep learning-based automatic WMH volume segmentation for classifying the grades of the Fazekas scale and differentiating subcortical vascular dementia. VUNO Med-DeepBrain was used as the WMH segmentation system. The segmentation system was designed with convolutional neural networks; the input was a pre-processed axial FLAIR image, and the output was a segmented WMH mask and its volume. Patients who presented with memory complaints between March 2017 and June 2018 were included and were split into a training set (March 2017-March 2018, n = 596) and an internal validation test set (April 2018-June 2018, n = 204).

Optimal cut-off values for categorizing WMH volume as normal vs. mild/moderate/severe, normal/mild vs. moderate/severe, and normal/mild/moderate vs. severe were 3.4 mL, 9.6 mL, and 17.1 mL, respectively, with AUCs of 0.921, 0.956, and 0.960. When differentiating normal/mild vs. moderate/severe using WMH volume in the test set, sensitivity, specificity, and accuracy were 96.4%, 89.9%, and 91.7%, respectively. For distinguishing subcortical vascular dementia from other diagnoses using WMH volume, sensitivity, specificity, and accuracy were 83.3%, 84.3%, and 84.3%, respectively.

Deep learning-based automatic WMH segmentation may be an accurate and promising method for classifying the grades of the Fazekas scale and differentiating subcortical vascular dementia.
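To illustrate how the reported cut-off values could translate into a severity grading step downstream of the segmentation, here is a minimal Python sketch. It only applies the three volume thresholds from the abstract (3.4 mL, 9.6 mL, 17.1 mL); the function name and category labels are illustrative assumptions, not the authors' implementation or the VUNO Med-DeepBrain API.

```python
def categorize_wmh_volume(wmh_volume_ml: float) -> str:
    """Map an automatically segmented WMH volume (in mL) to a severity
    category using the cut-off values reported in the abstract.

    Note: this is a hypothetical helper for illustration; it is not part
    of the study's software.
    """
    if wmh_volume_ml < 3.4:
        return "normal"
    elif wmh_volume_ml < 9.6:
        return "mild"
    elif wmh_volume_ml < 17.1:
        return "moderate"
    return "severe"


# Example: a 12 mL WMH load falls between the 9.6 mL and 17.1 mL cut-offs.
print(categorize_wmh_volume(12.0))  # -> "moderate"
```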
