
Simple Python Module for Conversions between DICOM Images and Radiation Therapy Structures, Masks, and Prediction Arrays.


Abstract

Deep learning is becoming increasingly popular and available to new users, particularly in the medical field. Deep learning image segmentation, outcome analysis, and generators rely on Digital Imaging and Communications in Medicine (DICOM) images, and often on radiation therapy (RT) structures represented as masks. While the technology to convert DICOM images and RT-Structures into other data types exists, no purpose-built Python module for converting NumPy arrays into RT-Structures exists. The two most popular deep learning libraries, TensorFlow and PyTorch, are both implemented within Python, and we believe a set of tools built in Python for manipulating DICOM images and RT-Structures would be useful and could save medical researchers large amounts of time and effort during the pre-processing and prediction steps. Our module provides intuitive methods for rapid data curation of RT-Structure files by identifying unique region of interest (ROI) names and ROI structure locations, and by allowing multiple ROI names to represent the same structure. It is also capable of converting DICOM images and RT-Structures into NumPy arrays and SimpleITK Images, the formats most commonly used for image analysis, as inputs to deep learning architectures, and for radiomic feature calculations. Furthermore, the tool provides a simple method for creating a DICOM RT-Structure from predicted NumPy arrays, which are commonly the output of semantic segmentation deep learning models. Accessing DicomRTTool via the public GitHub project invites open collaboration, while the deployment of our module on PyPI ensures painless distribution and installation. We believe our tool will be increasingly useful as deep learning in medicine progresses.
Copyright © 2021 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
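The two conversion directions described in the abstract can be sketched with DicomRTTool's `DicomReaderWriter` class. The class, method, and parameter names below follow the project's published examples, but treat the exact signatures as assumptions and consult the GitHub README for the current API; the folder paths and ROI names are placeholders, and the import is deferred so the sketch only requires the library when actually run.

```python
# Hedged sketch of the DicomRTTool workflow: DICOM + RT-Structure in,
# NumPy arrays out, and a predicted array back to an RT-Structure.
# Names follow DicomRTTool's published examples; verify against the
# project README before use (pip install DicomRTTool).

def dicom_to_arrays(dicom_folder, contour_names, associations=None):
    """Walk a DICOM directory and return (image, mask) NumPy arrays."""
    from DicomRTTool.ReaderWriter import DicomReaderWriter

    reader = DicomReaderWriter(description='example', arg_max=True)
    reader.walk_through_folders(dicom_folder)   # index images and RT-Structures
    reader.set_contour_names_and_associations(
        contour_names=contour_names,            # e.g. ['liver']
        associations=associations)              # map ROI aliases to one name
    reader.get_images_and_mask()                # build arrays for the case
    return reader.ArrayDicom, reader.mask       # image volume, mask array

def prediction_to_rtstruct(reader, prediction_array, output_dir, roi_names):
    """Write a model's predicted mask back out as a DICOM RT-Structure."""
    # Assumes `reader` has already loaded the reference images, so the
    # new structure inherits their geometry and frame of reference.
    reader.prediction_array_to_RT(prediction_array=prediction_array,
                                  output_dir=output_dir,
                                  ROI_Names=roi_names)
```

A typical session would call `dicom_to_arrays` once per patient during pre-processing, run inference on the returned image volume, then pass the prediction back through `prediction_to_rtstruct` so the result can be reviewed in standard RT planning software.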
