
Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio.


Abstract

We investigated the feasibility of measuring the hydronephrosis area to renal parenchyma (HARP) ratio from ultrasound images using a deep-learning network.

The coronal renal ultrasound images of 195 pediatric and adolescent patients who underwent pyeloplasty to repair ureteropelvic junction obstruction were retrospectively reviewed. After excluding cases without a representative longitudinal renal image, we used a dataset of 168 images for deep-learning segmentation. Ten networks, such as combinations of DeepLabV3+ and UNet++, were assessed for their ability to calculate the hydronephrosis and kidney areas, and an ensemble method was applied for further improvement. Four-fold cross-validation was conducted by dividing the image set into four subsets, and the segmentation performance of each network was evaluated against the manually traced areas using sensitivity, specificity, and the Dice similarity coefficient.

All 10 networks and the ensemble methods showed good visual correlation with the manually traced kidney and hydronephrosis areas. The Dice similarity coefficient of the 10-model ensemble was 0.9108 on average, and the best 5-model ensemble reached an average Dice similarity coefficient of 0.9113. We included patients with severe hydronephrosis who underwent renal ultrasonography at a single institution; thus, external validation of our algorithm in a heterogeneous ultrasonography examination setup with a diverse set of instruments is recommended.

Deep-learning-based calculation of the HARP ratio is feasible and showed high accuracy for assessing the severity of hydronephrosis on ultrasonography. This algorithm can help physicians make more accurate and reproducible diagnoses of hydronephrosis using ultrasonography.

© The Korean Urological Association.
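The abstract does not include code, but the quantities it describes (the Dice similarity coefficient used for evaluation, a probability-averaging ensemble of segmentation networks, and the HARP ratio itself) are straightforward to compute from segmentation masks. Below is a minimal NumPy sketch, not the authors' implementation: the mask names, the 0.5 ensemble threshold, and the assumption that the parenchyma area is the kidney area minus the hydronephrosis area are all illustrative.

```python
# Minimal sketch (illustrative only, not the study's code) of Dice evaluation,
# a simple probability-averaging ensemble, and the HARP ratio from binary masks.
import numpy as np


def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0


def ensemble_mask(prob_maps: list, threshold: float = 0.5) -> np.ndarray:
    """Average per-pixel probabilities from several networks and threshold
    the mean to obtain a single binary mask (threshold is an assumption)."""
    mean_prob = np.mean(np.stack(prob_maps, axis=0), axis=0)
    return mean_prob >= threshold


def harp_ratio(hydro_mask: np.ndarray, kidney_mask: np.ndarray) -> float:
    """Hydronephrosis area to renal parenchyma (HARP) ratio.

    Assumes the parenchyma area is the kidney area minus the hydronephrosis
    area; the abstract does not spell out the denominator explicitly.
    """
    hydro_area = hydro_mask.astype(bool).sum()
    parenchyma_area = kidney_mask.astype(bool).sum() - hydro_area
    return hydro_area / parenchyma_area if parenchyma_area > 0 else float("inf")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy 64x64 arrays standing in for per-model probability maps and a
    # manually traced ground-truth mask.
    probs = [rng.random((64, 64)) for _ in range(5)]
    pred_kidney = ensemble_mask(probs)
    truth_kidney = rng.random((64, 64)) >= 0.5
    print("Dice:", dice_coefficient(pred_kidney, truth_kidney))
```

In practice the per-model probability maps would come from the trained segmentation networks (e.g., DeepLabV3+ or UNet++ variants), with separate masks predicted for the whole kidney and the hydronephrotic region before the ratio is taken.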
