An Adaptive Digital Stain Separation Method for Deep Learning-based Automatic Cell Profile Counts.

Abstract

Background: Quantifying cells in a defined region of biological tissue is critical for many clinical and preclinical studies, especially in the fields of pathology, toxicology, cancer, and behavior. As part of a program to develop accurate, precise, and more efficient automatic approaches for quantifying morphometric changes in biological tissue, we have shown that both deep learning-based and hand-crafted algorithms can estimate the total number of histologically stained cells at their maximal profile of focus in extended depth of field (EDF) images. Deep learning-based approaches achieve accuracy comparable to manual counts on EDF images, with significantly better reproducibility and throughput efficiency and less error from human factors. However, most automated counting methods are designed for single-immunostained tissue sections.

New Method: To extend automatic counting to more complex dual-staining protocols, we developed an adaptive method to separate the stain color channels in images of tissue sections stained with a primary immunostain and a secondary counterstain.

Comparison with Existing Methods: The proposed method overcomes the limitations of state-of-the-art stain-separation methods, such as the requirement of a pure stain color basis as a prerequisite or the need to learn the stain color basis on each image.
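The abstract does not detail how the adaptive method estimates the stain basis, but the separation step such methods build on is classic Ruifrok-Johnston color deconvolution in optical-density space. Below is a minimal sketch of that step only, assuming an RGB float image in [0, 1] and an already-estimated (3, 3) stain matrix; the function name separate_stains_od and the example stain vectors are placeholders for illustration, not values or code from the paper.

import numpy as np

def separate_stains_od(rgb, stain_vectors):
    # Ruifrok-Johnston colour deconvolution: map RGB to optical
    # density, then unmix with the inverse of the stain matrix.
    # rgb: (H, W, 3) float image in [0, 1].
    # stain_vectors: (3, 3) array, one unit OD vector per row.
    od = -np.log10(np.clip(rgb, 1e-6, 1.0))
    concentrations = od.reshape(-1, 3) @ np.linalg.inv(stain_vectors)
    return concentrations.reshape(od.shape)

# Hypothetical stain matrix: rows for a DAB-like immunostain, a
# cresyl-violet-like counterstain, and a residual channel. These are
# placeholder values; the paper's contribution is estimating such
# vectors adaptively rather than fixing them in advance.
M = np.array([[0.27, 0.57, 0.78],
              [0.65, 0.70, 0.29],
              [0.71, 0.42, 0.56]])

Given an image img, separate_stains_od(img, M) returns one concentration map per stain, so each channel can then be processed as if it came from a single-stained section.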
Results: Experimental results are presented for automatic counts using deep learning-based and hand-crafted algorithms on sections immunostained for neurons (Neu-N) or microglial cells (Iba-1) with a cresyl violet counterstain. Our findings show more accurate counts from the deep learning-based methods than from the hand-crafted method.

Conclusions: Thus, stain-separated images can serve as input for automatic deep learning-based quantification methods designed for single-stained tissue sections.
Copyright © 2021. Published by Elsevier B.V.
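As a usage illustration only, and not the paper's deep learning pipeline, a simple hand-crafted profile count could consume one separated channel as sketched below; the count_profiles name, the Otsu threshold choice, and the min_area default are all assumptions made for this sketch.

from skimage.filters import threshold_otsu
from skimage.measure import label
from skimage.morphology import remove_small_objects

def count_profiles(stain_channel, min_area=30):
    # Toy hand-crafted count: threshold one stain concentration map,
    # discard small specks, and count the connected components.
    mask = stain_channel > threshold_otsu(stain_channel)
    mask = remove_small_objects(mask, min_size=min_area)
    return int(label(mask).max())

For example, count_profiles(maps[..., 0]) would count profiles on the immunostain channel produced by the deconvolution sketch above.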
