
Improving Augmented Reality Through Deep Learning: Real-time Instrument Delineation in Robotic Renal Surgery.


Abstract

Several barriers prevent the integration and adoption of augmented reality (AR) in robotic renal surgery despite the increased availability of virtual three-dimensional (3D) models. Apart from correct model alignment and deformation, not all instruments remain clearly visible in AR: superimposing a 3D model on top of the surgical video stream, including the instruments, can create a potentially hazardous surgical situation. We demonstrate real-time instrument detection during AR-guided robot-assisted partial nephrectomy and show that our algorithm generalizes to AR-guided robot-assisted kidney transplantation. We developed an algorithm using deep learning networks to detect all nonorganic items. The algorithm was trained on 65 927 manually labeled instruments across 15 100 frames. Our setup, which runs on a standalone laptop, was deployed in three different hospitals and used by four different surgeons. Instrument detection is a simple and feasible way to enhance the safety of AR-guided surgery. Future investigations should strive to optimize efficient video processing to minimize the 0.5-s delay currently experienced. General AR applications also need further optimization, including detection and tracking of organ deformation, for full clinical implementation.

Copyright © 2023 European Association of Urology. Published by Elsevier B.V. All rights reserved.
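The abstract does not describe the implementation, so the following is only a minimal sketch of the general idea it outlines: segment instrument pixels with a deep learning network, then composite the rendered 3D-model overlay onto every pixel except those belonging to instruments, so the instruments stay visible. The network choice (a torchvision DeepLabV3 stand-in), the helper names, and the blending weight are all assumptions for illustration, not the authors' method.

```python
"""Sketch: keep instruments visible under an AR overlay via a segmentation mask.
The segmentation model here is a generic stand-in, not the paper's network."""

import cv2
import numpy as np
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# Stand-in network (untrained): 2 classes, background vs. instrument.
model = deeplabv3_resnet50(weights=None, num_classes=2)
model.eval()


def instrument_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a binary (H, W) mask that is 1 where instruments are predicted."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).unsqueeze(0)  # (1, 3, H, W)
    with torch.no_grad():
        logits = model(tensor)["out"]                              # (1, 2, H, W)
    return logits.argmax(dim=1).squeeze(0).numpy().astype(np.uint8)


def composite(frame_bgr: np.ndarray, overlay_bgr: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend the rendered 3D-model overlay onto the frame, except on instrument pixels."""
    mask = instrument_mask(frame_bgr)[..., None]                   # (H, W, 1), broadcasts over channels
    blended = cv2.addWeighted(frame_bgr, 1.0 - alpha, overlay_bgr, alpha, 0.0)
    # Instrument pixels keep the raw surgical video; everything else shows the AR blend.
    return np.where(mask == 1, frame_bgr, blended)


if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)    # placeholder video frame
    overlay = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder rendered 3D model
    print(composite(frame, overlay).shape)
```

In a live system the per-frame segmentation and compositing would have to run within the video pipeline's latency budget, which is the context for the 0.5-s delay the authors report as the main target for optimization.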
