
Deep-learning-aided forward optical coherence tomography endoscope for percutaneous nephrostomy guidance.

Abstract

Percutaneous renal access is a critical initial step in many medical settings. To obtain the best surgical outcome with minimal patient morbidity, an improved method for accessing the renal calyx is needed. In our study, we built a forward-view optical coherence tomography (OCT) endoscopic system for percutaneous nephrostomy (PCN) guidance. Porcine kidneys were imaged in our experiment to demonstrate the feasibility of the imaging system. Three tissue types of porcine kidneys (renal cortex, medulla, and calyx) can be clearly distinguished in the endoscopic OCT images owing to their morphological and tissue differences. To further improve guidance efficacy and reduce the learning burden on clinicians, a deep-learning-based computer-aided diagnosis platform was developed to automatically classify the OCT images by renal tissue type. Convolutional neural networks (CNNs) were developed with labeled OCT images based on the ResNet34, MobileNetV2, and ResNet50 architectures. Nested cross-validation and testing were used to benchmark the classification performance with uncertainty quantification across 10 kidneys, demonstrating robust performance despite substantial biological variability among kidneys. ResNet50-based CNN models achieved an average classification accuracy of 82.6%±3.0%. The classification precisions were 79%±4% for cortex, 85%±6% for medulla, and 91%±5% for calyx, and the classification recalls were 68%±11% for cortex, 91%±4% for medulla, and 89%±3% for calyx. Interpretation of the CNN predictions highlighted the discriminative characteristics of the three renal tissue types in the OCT images. The results validated the technical feasibility of using this novel imaging platform to automatically recognize renal tissue structures ahead of the PCN needle during PCN surgery.
© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement.
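Because the abstract names only the architectures and the validation scheme, the sketch below shows one plausible way such a pipeline could be set up. It is a minimal sketch, not the authors' released code: it fine-tunes an ImageNet-pretrained ResNet50 for three-class OCT frame classification (cortex, medulla, calyx) and benchmarks it with kidney-grouped cross-validation so that no kidney contributes frames to both training and test folds. The `oct_frames/<class>/<kidneyID>_<frame>.png` layout, the hyperparameters, and the helper functions are assumptions, and a single level of grouped cross-validation stands in for the nested cross-validation and uncertainty quantification reported in the paper.

```python
# Hedged sketch of a ResNet50 OCT tissue classifier with kidney-grouped cross-validation.
# File layout, hyperparameters, and helper names are assumptions, not the authors' code.
import os

import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset
from torchvision import models, transforms
from torchvision.datasets import ImageFolder
from sklearn.model_selection import GroupKFold
from sklearn.metrics import precision_recall_fscore_support

# ImageFolder assigns class indices alphabetically by folder name.
CLASSES = ["calyx", "cortex", "medulla"]


def build_model(num_classes: int = 3) -> nn.Module:
    """ResNet50 pretrained on ImageNet with a new 3-way classification head."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model


def train_one_fold(model, loader, device, epochs=5, lr=1e-4):
    """Plain cross-entropy fine-tuning of the whole network."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, targets in loader:
            images, targets = images.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), targets)
            loss.backward()
            optimizer.step()


@torch.no_grad()
def evaluate(model, loader, device):
    """Return overall accuracy plus per-class precision and recall."""
    model.eval()
    preds, trues = [], []
    for images, targets in loader:
        logits = model(images.to(device))
        preds.append(logits.argmax(dim=1).cpu())
        trues.append(targets)
    preds = torch.cat(preds).numpy()
    trues = torch.cat(trues).numpy()
    accuracy = (preds == trues).mean()
    precision, recall, _, _ = precision_recall_fscore_support(
        trues, preds, labels=list(range(len(CLASSES))), zero_division=0
    )
    return accuracy, precision, recall


def main():
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    tfm = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])
    # Hypothetical layout: oct_frames/<class>/<kidneyID>_<frame>.png
    dataset = ImageFolder("oct_frames", transform=tfm)
    # Group frames by kidney so each fold evaluates on kidneys never seen in training.
    groups = [os.path.basename(path).split("_")[0] for path, _ in dataset.samples]
    targets = [label for _, label in dataset.samples]

    fold_accuracies = []
    # 10 splits assumes 10 distinct kidneys, i.e. roughly one held-out kidney per fold.
    for train_idx, test_idx in GroupKFold(n_splits=10).split(np.zeros(len(targets)), targets, groups):
        train_loader = DataLoader(Subset(dataset, train_idx), batch_size=32, shuffle=True)
        test_loader = DataLoader(Subset(dataset, test_idx), batch_size=32)
        model = build_model(len(CLASSES)).to(device)
        train_one_fold(model, train_loader, device)
        accuracy, precision, recall = evaluate(model, test_loader, device)
        fold_accuracies.append(accuracy)
        print(f"accuracy={accuracy:.3f}  precision={precision.round(2)}  recall={recall.round(2)}")

    print(f"mean accuracy: {np.mean(fold_accuracies):.3f} +/- {np.std(fold_accuracies):.3f}")


if __name__ == "__main__":
    main()
```

Grouping the folds by kidney rather than by frame is the key design choice here: frames from the same kidney are highly correlated, so a random frame-level split would overstate accuracy relative to the kidney-to-kidney biological variability the study emphasizes.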
