
Indoor localization of hand-held OCT probe using visual odometry and real-time segmentation using deep learning.

Abstract

Optical coherence tomography (OCT) is an established medical imaging modality that has found widespread use due to its ability to visualize tissue structures at high resolution. Currently, hand-held OCT imaging probes lack positional information, making it difficult or even impossible to link a specific image to the location at which it was originally obtained. In this study, we propose a camera-based localization method to track and record the scanner position in real time, together with a deep learning-based segmentation method.

We used camera-based visual odometry (VO) and simultaneous localization and mapping (SLAM) to compute and visualize the location of a hand-held OCT imaging probe. A deep convolutional neural network (CNN) was used for kidney tubule lumen segmentation.

The mean absolute error (MAE) and standard deviation (STD) for 1D translation were 0.15 mm and 0.26 mm, respectively. For 2D translation, the MAE and STD were 0.85 mm and 0.50 mm, respectively. The Dice coefficient of the segmentation method was 0.7. The p-values of the t-tests between predicted and actual average density, and between predicted and actual average diameter, were both 0.9999. We also experimented on a preserved kidney using our localization method with automatic segmentation, and compared the average density maps and average diameter maps between the 3D comprehensive scan and the VO-system scan.

Our results demonstrate that VO can track the probe location with high accuracy and provides a user-friendly visualization tool for reviewing OCT 2D images in 3D space. They also indicate that deep learning can deliver high accuracy and high speed for segmentation. The proposed methods could potentially be used to predict delayed graft function (DGF) in kidney transplantation.
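The evaluation metrics quoted in the abstract (MAE/STD for probe translation, Dice coefficient for segmentation overlap) follow standard definitions. A minimal sketch of how they are commonly computed is shown below; the function names and NumPy implementation are illustrative, not taken from the paper.

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice similarity coefficient between two binary segmentation masks."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    total = pred.sum() + target.sum()
    if total == 0:
        return 1.0  # both masks empty: treated as perfect agreement by convention
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / total

def translation_error_stats(predicted_mm, actual_mm):
    """Mean absolute error and standard deviation of the absolute error,
    e.g. for predicted vs. ground-truth probe translations in millimetres."""
    err = np.abs(np.asarray(predicted_mm, dtype=float)
                 - np.asarray(actual_mm, dtype=float))
    return err.mean(), err.std()
```

With these definitions, a Dice of 0.7 means the predicted and reference tubule-lumen masks share 70% of their combined area (in the 2·|A∩B|/(|A|+|B|) sense), and the 1D MAE of 0.15 mm is the mean of the per-measurement absolute translation errors.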
