BERT in Radiology: A Systematic Review of Natural Language Processing Applications.

Abstract

BERT (Bidirectional Encoder Representations from Transformers), introduced in 2018, has revolutionized natural language processing (NLP). Its bidirectional understanding of word context has enabled innovative applications, notably in radiology. This study aimed to assess BERT's influence and applications within the radiologic domain.

Adhering to PRISMA guidelines, we conducted a systematic review, searching PubMed for literature on BERT-based models and NLP in radiology published from January 1, 2018, to February 12, 2023. The search encompassed keywords related to generative models, transformer architecture, and various imaging techniques.

Of 597 results, 30 met our inclusion criteria; the remainder were unrelated to radiology or did not use BERT-based models. All included studies were retrospective, with 14 published in 2022. The primary focus was classification and information extraction from radiology reports, with x-rays as the most prevalent imaging modality. Specific investigations included automatic CT protocol assignment and deep learning applications in chest x-ray interpretation.

This review underscores the primary application of BERT in radiology: report classification. It also reveals emerging BERT applications in protocol assignment and report generation. As BERT technology advances, we foresee further innovative applications. Its implementation in radiology holds potential for enhancing diagnostic precision, expediting report generation, and optimizing patient care.
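To make the review's dominant use case concrete, the sketch below shows how a BERT model can be applied to classify radiology report text using the Hugging Face Transformers library. The checkpoint name, label set, and example report are illustrative assumptions, not taken from any study in the review; in practice a domain-adapted, fine-tuned BERT variant would be used.

    # Minimal sketch: BERT-based classification of a radiology report.
    # Assumptions (not from the reviewed studies): the checkpoint name,
    # the label set, and the example report are placeholders.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    MODEL_NAME = "bert-base-uncased"                # assumed checkpoint
    LABELS = ["no acute finding", "acute finding"]  # assumed binary label set

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME, num_labels=len(LABELS)
    )
    model.eval()

    report = "Chest x-ray: no focal consolidation, pleural effusion, or pneumothorax."
    inputs = tokenizer(report, return_tensors="pt", truncation=True, max_length=512)

    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, num_labels)

    prediction = LABELS[int(logits.argmax(dim=-1))]
    print(prediction)  # arbitrary until the classification head is fine-tuned

In the studies surveyed, such classification heads were fine-tuned on labeled report corpora before use; the freshly initialized head above would need task-specific fine-tuning to produce meaningful predictions.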
