Segmentation and Tracking of Mammary Epithelial Organoids in Brightfield Microscopy.

Abstract

We present an automated, deep-learning-based workflow for quantitatively analyzing the spatiotemporal development of mammary epithelial organoids in two-dimensional time-lapse (2D+t) sequences acquired with a high-resolution brightfield microscope. The workflow combines a convolutional neural network (U-Net), trained on computer-generated bioimage data produced by a conditional generative adversarial network (pix2pixHD), to infer semantic segmentation; adaptive morphological filtering to identify individual organoid instances; and a shape-similarity-constrained, instance-segmentation-correcting tracking procedure to reliably follow the organoid instances of interest over time. Validation on real 2D+t sequences of mouse mammary epithelial organoids with morphologically distinct phenotypes shows that the workflow achieves reliable segmentation and tracking performance, providing a reproducible and labor-free alternative to manual analysis of the acquired bioimage data.
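To make the three-stage pipeline concrete, the sketch below outlines one possible realization of the post-network steps: thresholding a U-Net probability map, cleaning it up with morphological filtering to obtain instances, and associating instances across frames with a cost that mixes position and shape similarity. All function names, thresholds, feature choices, and the Hungarian-matching formulation are illustrative assumptions; the abstract does not specify the authors' actual parameters or implementation.

```python
"""Minimal sketch of the described workflow, assuming a pre-trained U-Net that
outputs per-pixel foreground probabilities for each frame. Thresholds, feature
weights, and helper names are hypothetical, not taken from the paper."""
import numpy as np
from scipy.optimize import linear_sum_assignment
from skimage import measure, morphology


def segment_frame(probability_map, threshold=0.5, min_area=500):
    """Semantic probability map -> labeled organoid instances via thresholding
    and morphological clean-up (a stand-in for the adaptive filtering step)."""
    mask = probability_map > threshold
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    mask = morphology.binary_closing(mask, morphology.disk(3))
    return measure.label(mask)  # labeled instance image


def shape_descriptor(region):
    """Simple shape features used here as a similarity proxy."""
    return np.array([region.area, region.eccentricity * 1000.0])


def match_instances(prev_labels, curr_labels, max_cost=2000.0):
    """Frame-to-frame association constrained by shape similarity;
    returns matched pairs (prev_label, curr_label)."""
    prev_regions = measure.regionprops(prev_labels)
    curr_regions = measure.regionprops(curr_labels)
    if not prev_regions or not curr_regions:
        return []
    cost = np.zeros((len(prev_regions), len(curr_regions)))
    for i, p in enumerate(prev_regions):
        for j, c in enumerate(curr_regions):
            centroid_dist = np.linalg.norm(
                np.array(p.centroid) - np.array(c.centroid))
            shape_dist = np.linalg.norm(
                shape_descriptor(p) - shape_descriptor(c))
            cost[i, j] = centroid_dist + 0.01 * shape_dist
    rows, cols = linear_sum_assignment(cost)  # optimal assignment
    return [(prev_regions[i].label, curr_regions[j].label)
            for i, j in zip(rows, cols) if cost[i, j] < max_cost]
```

In this toy formulation, links whose combined position-and-shape cost exceeds `max_cost` are rejected, which is one simple way a shape-similarity constraint can suppress spurious matches; the paper's actual correction of instance-segmentation errors during tracking is not reproduced here.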
