
Deep learning to obtain simultaneous image and segmentation outputs from a single input of raw ultrasound channel data.


Abstract

Single plane wave transmissions are promising for automated imaging tasks requiring high ultrasound frame rates over an extended field of view. However, a single plane wave insonification typically produces suboptimal image quality. To address this limitation, we are exploring the use of deep neural networks (DNNs) as an alternative to delay-and-sum beamforming. The objectives of this work are to obtain information directly from raw channel data and to simultaneously generate both a segmentation map for automated ultrasound tasks and a corresponding ultrasound B-mode image for interpretable supervision of the automation. We focus on visualizing and segmenting anechoic targets surrounded by tissue, while ignoring or de-emphasizing less important surrounding structures. DNNs trained with Field II simulations were tested with simulated, experimental phantom, and in vivo datasets that were not included during training. With unfocused input channel data (i.e., prior to the application of receive time delays), the simulated, experimental phantom, and in vivo test datasets achieved mean ± standard deviation Dice similarity coefficients of 0.92 ± 0.13, 0.92 ± 0.03, and 0.77 ± 0.07, respectively, and generalized contrast-to-noise ratios (gCNR) of 0.95 ± 0.08, 0.93 ± 0.08, and 0.75 ± 0.14, respectively. With subaperture beamformed channel data and a modification to the input layer of the DNN architecture to accept these data, the fidelity of image reconstruction increased (e.g., the mean gCNR of multiple acquisitions of two in vivo breast cysts ranged from 0.89 to 0.96), but DNN display frame rates were reduced from 395 Hz to 287 Hz. Overall, the DNNs successfully translated feature representations learned from simulated data to phantom and in vivo data, which is promising for this novel approach to simultaneous ultrasound image formation and segmentation.
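The two figures of merit reported above have standard definitions: the Dice similarity coefficient between binary masks A and B is 2|A ∩ B| / (|A| + |B|), and gCNR = 1 − OVL, where OVL is the overlap between the probability distributions of pixel values inside and outside the target. Below is a minimal sketch of how these might be computed with NumPy; the function names, histogram binning, and edge-case handling are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient, DSC = 2|A ∩ B| / (|A| + |B|),
    for two binary segmentation masks of the same shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement (assumed convention)
    return 2.0 * np.logical_and(pred, truth).sum() / denom

def gcnr(target, background, bins=256):
    """Generalized contrast-to-noise ratio: gCNR = 1 - OVL, where OVL is
    the overlap of the pixel-value distributions inside the target and in
    the background, estimated here with histograms over a shared range."""
    lo = min(target.min(), background.min())
    hi = max(target.max(), background.max())
    p_t, _ = np.histogram(target, bins=bins, range=(lo, hi))
    p_b, _ = np.histogram(background, bins=bins, range=(lo, hi))
    p_t = p_t / p_t.sum()  # empirical PDF inside the target
    p_b = p_b / p_b.sum()  # empirical PDF in the background
    return 1.0 - np.minimum(p_t, p_b).sum()
```

For an anechoic cyst, `target` would hold the pixel values inside the cyst region of the reconstructed B-mode image and `background` the values in the surrounding tissue; a gCNR of 1 means the two distributions are perfectly separable, so higher values indicate better lesion detectability.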
