Deep causal learning for pancreatic cancer segmentation in CT sequences.
Abstract
Segmenting the irregularly shaped pancreas and the inconspicuous tumor simultaneously is an essential but challenging step in diagnosing pancreatic cancer. Current deep-learning (DL) methods usually segment the pancreas or the tumor independently from mixed image features, which are disrupted by the complex, low-contrast background tissues surrounding them. Here, we propose a deep causal learning framework named CausegNet for pancreas and tumor co-segmentation in 3D CT sequences. Specifically, a causality-aware module and a counterfactual loss are employed to enhance the DL network's comprehension of the anatomical causal relationship between the foreground elements (pancreas and tumor) and the background. By integrating causality into CausegNet, the network focuses solely on extracting intrinsic foreground causal features while effectively learning the potential causality between the pancreas and the tumor. Based on the extracted causal features, CausegNet then applies counterfactual inference to significantly reduce background interference and sequentially search for the pancreas and the tumor within the foreground. Consequently, our approach can handle the deformable pancreas and obscure tumors, yielding superior co-segmentation performance on both public and real clinical datasets, with the highest pancreas/tumor Dice coefficients of 86.67%/84.28%. Visualized features and anti-noise experiments further demonstrate the causal interpretability and stability of our method. Furthermore, compared to experienced clinicians, our approach improves the accuracy and sensitivity of the downstream pancreatic cancer risk assessment task by 12.50% and 50.00%, respectively, indicating promising clinical applications.
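The reported pancreas/tumor results are given as Dice coefficients, the standard overlap metric for volumetric segmentation. As a reference, below is a minimal NumPy sketch of that metric for binary 3D masks; the example arrays are synthetic placeholders for illustration only and do not come from the paper or its datasets.

```python
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-7) -> float:
    """Dice overlap between two binary 3D masks (1 = structure, 0 = background)."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    # Dice = 2*|A ∩ B| / (|A| + |B|); eps guards against empty masks.
    return (2.0 * intersection + eps) / (pred.sum() + gt.sum() + eps)

# Hypothetical usage: in practice the predicted mask would come from the
# segmentation network and the ground-truth mask from expert annotations.
rng = np.random.default_rng(0)
pred_pancreas = rng.integers(0, 2, size=(64, 128, 128))
gt_pancreas = rng.integers(0, 2, size=(64, 128, 128))
print(f"Pancreas Dice: {dice_coefficient(pred_pancreas, gt_pancreas):.4f}")
```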