Fus2Net: a novel Convolutional Neural Network for classification of benign and malignant breast tumor in ultrasound images.

Abstract

Background: The rapid development of artificial intelligence has improved automatic breast cancer diagnosis compared with traditional machine learning methods. Convolutional Neural Networks (CNNs) can automatically learn highly discriminative features, which raises the level of computer-aided diagnosis (CAD), improves the ability to distinguish benign from malignant breast ultrasound (BUS) tumor images, and makes rapid breast tumor screening possible.

Methods: Existing public datasets are small and suffer from class imbalance. In this paper, we provide a relatively larger dataset with a total of 1052 ultrasound images, including 696 benign and 356 malignant images collected from a local hospital. We propose a novel CNN named Fus2Net for benign versus malignant classification of BUS tumor images; it contains two self-designed feature extraction modules. To evaluate how well the classifier generalizes on the experimental dataset, we employed the training set (646 benign and 306 malignant cases) for tenfold cross-validation. To address the class imbalance, the training data were augmented before being fed into Fus2Net. In the experiments, hyperparameter fine-tuning and regularization were used to make Fus2Net converge.

Results: The classification model was evaluated on a separate set of 100 BUS tumor images (50 benign and 50 malignant) that was not used in network training. Evaluation indicators include accuracy, sensitivity, specificity, and area under the curve (AUC). Fus2Net achieved an accuracy of 92%, a sensitivity of 95.65%, a specificity of 88.89%, and an AUC of 0.97 for classifying BUS tumor images.

Conclusions: The experiments compared existing CNN classification architectures, and the customized Fus2Net architecture shows advantages in overall performance. The results demonstrate that the proposed Fus2Net classification method can better assist radiologists in diagnosing benign and malignant BUS tumor images.

© 2021. The Author(s).
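The training protocol described above (tenfold cross-validation on the 952-image training set, with augmentation applied only to the training folds to offset the benign/malignant imbalance) can be illustrated with a short sketch. This is a minimal, hedged example, not the authors' code: `build_fus2net` is a hypothetical stand-in for the paper's architecture, whose layers are not described in the abstract, and the augmentation settings are assumptions rather than the actual configuration.

```python
# Hedged sketch: tenfold cross-validation with on-the-fly augmentation of the
# training folds only. `build_fus2net()` is a hypothetical stand-in for the
# paper's architecture; augmentation parameters are illustrative assumptions.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from tensorflow.keras.preprocessing.image import ImageDataGenerator

def cross_validate(images, labels, build_fus2net, n_splits=10, epochs=50):
    """images: (N, H, W, C) float array; labels: (N,) 0 = benign, 1 = malignant."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=42)
    augmenter = ImageDataGenerator(rotation_range=15,
                                   horizontal_flip=True,
                                   width_shift_range=0.1,
                                   height_shift_range=0.1)
    fold_scores = []
    for train_idx, val_idx in skf.split(images, labels):
        model = build_fus2net()                      # fresh weights for each fold
        model.compile(optimizer="adam",
                      loss="binary_crossentropy",
                      metrics=["accuracy"])
        # Augment only the training fold; validate on unmodified images.
        train_flow = augmenter.flow(images[train_idx], labels[train_idx],
                                    batch_size=32)
        model.fit(train_flow, epochs=epochs, verbose=0)
        _, acc = model.evaluate(images[val_idx], labels[val_idx], verbose=0)
        fold_scores.append(acc)
    return float(np.mean(fold_scores))
```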
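The four reported indicators (accuracy 92%, sensitivity 95.65%, specificity 88.89%, AUC 0.97) follow from a standard confusion-matrix computation on the 100-image held-out test set. The sketch below shows how such values are typically obtained with scikit-learn; the 0.5 decision threshold and the variable names are assumptions, not details taken from the paper.

```python
# Hedged sketch: accuracy, sensitivity, specificity, and AUC from predicted
# probabilities on a held-out test set. The 0.5 threshold is an assumption.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

def evaluate(y_true, y_prob, threshold=0.5):
    """y_true: (N,) 0 = benign, 1 = malignant; y_prob: (N,) predicted P(malignant)."""
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),   # true positive rate on malignant cases
        "specificity": tn / (tn + fp),   # true negative rate on benign cases
        "auc":         roc_auc_score(y_true, y_prob),
    }
```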
