
Domain affiliated distilled knowledge transfer for improved convergence of Ph-negative MPN identifier.


Abstract

Ph-negative myeloproliferative neoplasm (MPN) is a rare yet dangerous disease that can progress into more severe disorders. Clinical diagnosis of the disease exists but often requires collecting multiple types of pathology data, which can be tedious and time-consuming. Meanwhile, deep learning studies on the disease are scarce and typically rely on small amounts of pathological data owing to the disease's rarity. In addition, existing work does not address the data scarcity issue beyond common techniques such as data augmentation, leaving room for performance improvement. To tackle this issue, the proposed research utilizes distilled knowledge learned from a larger dataset to boost the performance of a lightweight model trained on a small MPN dataset. First, a 50-layer ResNet model is trained on a large lymph node image dataset of 327,680 images, and its knowledge is then distilled into a small 4-layer CNN model. Afterward, the CNN is initialized with the distilled weights and further trained on a small MPN dataset of 300 images. Empirical analysis shows that the CNN with distilled knowledge achieves 97% accuracy, compared to 89.67% for an identical CNN trained from scratch. The distilled knowledge transfer approach also proves more effective than simpler data scarcity handling approaches such as augmentation and manual feature extraction. Overall, the research affirms the effectiveness of transferring distilled knowledge to address data scarcity and achieves better convergence when training a lightweight model on a Ph-negative MPN image dataset.

Copyright: © 2024 Reza et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
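The teacher-to-student transfer described above is typically implemented with a distillation loss that combines soft teacher targets and hard labels (Hinton et al.). The abstract does not state the paper's exact loss or hyperparameters, so the sketch below is a minimal NumPy illustration of the general technique; the temperature `T` and weight `alpha` are assumed values, not the authors'.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" between classes.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a soft-target KL term and hard-label cross-entropy.

    T and alpha are illustrative hyperparameters (not from the paper).
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) per sample, scaled by T^2 so its gradient
    # magnitude stays comparable across temperatures.
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # Standard cross-entropy against the ground-truth (hard) labels at T=1.
    probs = softmax(student_logits)
    hard = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * hard))
```

In the paper's setup, the teacher logits would come from the ResNet-50 trained on the 327,680-image lymph node dataset, and the student is the 4-layer CNN; after distillation, the student is fine-tuned on the 300-image MPN dataset.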
