ConvPath: A software tool for lung adenocarcinoma digital pathological image analysis aided by a convolutional neural network.

Researchers

Journal

Modalities

Models

Abstract

The spatial distributions of different types of cells could reveal a cancer cell’s growth pattern, its relationships with the tumor microenvironment and the immune response of the body, all of which represent key “hallmarks of cancer”. However, the process by which pathologists manually recognize and localize all the cells in pathology slides is extremely labor intensive and error prone.
In this study, we developed an automated cell type classification pipeline, ConvPath, which includes nuclei segmentation, convolutional neural network (CNN)-based classification of tumor cells, stromal cells, and lymphocytes, and extraction of tumor microenvironment-related features from lung cancer pathology images. To help users leverage this pipeline in their own research, all source scripts for the ConvPath software are available at https://qbrc.swmed.edu/projects/cnn/.
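To make the pipeline steps concrete, the following is a minimal illustrative sketch of a ConvPath-style workflow, not the authors' released code: segment nuclei, classify an image patch centered on each nucleus with a CNN, and record the result as a spatial map of cell types. The patch size, the `cnn` callable, and the helper function names are assumptions for illustration.

```python
# Illustrative sketch of a ConvPath-style pipeline (not the authors' released code):
# 1) segment nuclei, 2) classify a patch around each nucleus with a CNN,
# 3) collect the results into a "spatial map" of cell types.
import numpy as np
from skimage import color, filters, measure

CELL_TYPES = ("tumor", "stromal", "lymphocyte")  # the three classes used by ConvPath

def segment_nuclei(rgb_image):
    """Very simple nuclei segmentation via Otsu thresholding on the grayscale image."""
    gray = color.rgb2gray(rgb_image)
    mask = gray < filters.threshold_otsu(gray)      # nuclei stain darker than background
    labels = measure.label(mask)
    return [tuple(int(c) for c in region.centroid)  # (row, col) centroid per nucleus
            for region in measure.regionprops(labels)]

def classify_patch(rgb_image, center, cnn, patch_size=80):
    """Crop a patch around a nucleus centroid and return the CNN-predicted cell type."""
    r, c = center
    half = patch_size // 2
    patch = rgb_image[max(r - half, 0):r + half, max(c - half, 0):c + half]
    probs = cnn(patch)                              # assumed: returns 3 class probabilities
    return CELL_TYPES[int(np.argmax(probs))]

def build_spatial_map(rgb_image, cnn):
    """Return a list of (row, col, cell_type) tuples -- the 'spatial map' of the slide."""
    return [(r, c, classify_patch(rgb_image, (r, c), cnn))
            for r, c in segment_nuclei(rgb_image)]
```

In the actual study the nuclei segmentation and CNN are considerably more sophisticated; this sketch only shows how the three stages fit together.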
The overall classification accuracy was 92.9% and 90.1% in the training and independent testing datasets, respectively. By identifying cells and classifying cell types, this pipeline can convert a pathology image into a “spatial map” of tumor cells, stromal cells, and lymphocytes. From this spatial map, we can extract features that characterize the tumor microenvironment. Based on these features, we developed an image feature-based prognostic model and validated it in two independent cohorts. The predicted risk group serves as an independent prognostic factor after adjusting for clinical variables, including age, gender, smoking status, and stage.
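As a rough sketch of what "independent prognostic factor after adjusting for clinical variables" means in practice, the snippet below fits a Cox proportional hazards model with the lifelines library, including an image-derived risk group alongside age, gender, smoking status, and stage. The column names and the small DataFrame are hypothetical placeholders, not the paper's data or code.

```python
# Hedged sketch: testing whether an image-derived risk group remains prognostic
# after adjusting for clinical covariates, via a Cox proportional hazards model.
# All column names and values below are made up for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time":       [34, 12, 58, 7, 41, 23, 66, 15, 50, 29],  # survival time (months)
    "event":      [1, 1, 0, 1, 0, 1, 0, 1, 1, 0],            # 1 = death observed, 0 = censored
    "risk_group": [1, 1, 0, 1, 0, 0, 0, 1, 0, 1],            # 1 = high risk from image features
    "age":        [67, 72, 55, 60, 48, 70, 63, 58, 74, 66],
    "male":       [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],
    "smoker":     [1, 1, 0, 1, 0, 1, 0, 1, 1, 0],
    "stage":      [2, 3, 1, 3, 1, 2, 1, 3, 2, 2],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # hazard ratio for risk_group, adjusted for age, gender, smoking, stage
```

A hazard ratio for `risk_group` that remains significant with the clinical covariates in the model is what supports the claim that the image-based risk group is an independent prognostic factor.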
The analysis pipeline developed in this study could convert the pathology image into a “spatial map” of tumor cells, stromal cells and lymphocytes. This could greatly facilitate and empower comprehensive analysis of the spatial organization of cells, as well as their roles in tumor progression and metastasis.
Copyright © 2019 The Authors. Published by Elsevier B.V. All rights reserved.
