AI used to identify pancreatic cancer in CT scans


In a recent study published in the journal Radiology, researchers in Taiwan developed a deep learning (DL)–based computer-aided detection (CAD) tool to detect pancreatic cancer on contrast-enhanced abdominal computed tomography (CT) scans.

Study: Pancreatic Cancer Detection on CT Scans with Deep Learning: A Nationwide Population-based Study. Image Credit: Suttha Burawonk / Shutterstock


Pancreatic cancer has the lowest five-year survival rate of any major cancer, and projections show it will become the second leading cause of cancer death in the United States by 2030. Moreover, prognosis worsens rapidly once the tumor grows larger than 2 cm, making early detection essential.

Currently, CT-based diagnosis misses nearly 40% of pancreatic tumors smaller than 2 cm and is further hampered by disparities in radiologist expertise. There is thus an urgent unmet medical need for tools that help radiologists improve the sensitivity of pancreatic cancer detection. Segmentation, i.e., delineating the pancreas on CT images, is particularly challenging because the organ varies in size and shape and borders multiple other organs and structures.

In one of their previous works, the researchers demonstrated that a DL-based convolutional neural network (CNN) could accurately distinguish pancreatic cancer from the non-cancerous pancreas.

About the study

In the present study, researchers tested and validated a similar computer-aided detection (CAD) tool that incorporated a CNN for segmenting the pancreas on CT images. Additionally, the tool included an ensemble classifier of five independent classification CNNs to predict the presence of pancreatic cancer. All analyzed CT scans were obtained in the portal venous phase, 70–80 seconds after intravenous administration of the contrast medium.

Training and validation datasets and local and nationwide test datasets were used in the study. The team randomly divided pancreatic cancer patients in an 8:2 ratio into the training and validation set and the local test set, respectively. They prospectively collected CT studies of 546 patients with pancreatic cancer diagnosed between January 2006 and July 2018 from clinical practices in Taiwan, which formed their local dataset. These patients were 18 years or older with confirmed pancreatic adenocarcinoma with findings registered in the national cancer registry. The control group for the local dataset comprised CT studies of 1,465 individuals with normal pancreas collected between January 2004 and December 2019. 
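The 8:2 random split described above can be sketched as follows. This is an illustration only; the function name and fixed seed are assumptions, not details reported in the study.

```python
import random

def split_8_2(patient_ids, seed=42):
    """Randomly divide patients 8:2 into a training/validation set and a
    local test set, as described for the local dataset. The seed is fixed
    only to make this illustration reproducible."""
    rng = random.Random(seed)
    shuffled = list(patient_ids)
    rng.shuffle(shuffled)
    cut = round(0.8 * len(shuffled))
    return shuffled[:cut], shuffled[cut:]
```

Applied to the 546 local pancreatic cancer patients, such a split yields 437 patients for training/validation and 109 for the local test set, consistent with the 437 training/validation cases mentioned below.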

The researchers searched the registry of the National Health Insurance (NHI) Major Illness Certificate to retrieve CT studies of 669 patients with newly diagnosed pancreatic cancer between January 2018 and July 2019. Likewise, they extracted CT studies of 72 kidney and liver donors during the same time from the NHI database, which formed the control group. They further combined these two with CT studies of 732 control subjects from the tertiary referral center imaging archive of the NHI database to create the nationwide test dataset of the current study.

Lastly, the team trained the five classification CNNs on different subsets of the training and validation sets retrieved from the tertiary referral center of the NHI database, which comprised CT studies of 437 pancreatic cancer patients and 586 controls. A CT study was considered to show pancreatic cancer only when the number of positive-predicting CNNs was equal to or greater than the smallest number that yielded a positive likelihood ratio (LR) greater than 1 in the validation set.
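The voting rule can be illustrated with a short sketch; the data shapes and helper names below are assumptions for illustration, not the study's actual implementation. The operating threshold is the smallest number of positive votes whose positive LR on the validation set exceeds 1.

```python
def choose_vote_threshold(votes, labels):
    """Return the smallest k such that the rule 'cancer if >= k positive
    votes' achieves a positive likelihood ratio (sensitivity divided by
    false-positive rate) greater than 1 on the validation set.

    votes  : list of per-patient lists of 0/1 votes from the five CNNs
    labels : list of 0/1 ground-truth labels (1 = pancreatic cancer)
    """
    n_cnns = len(votes[0])
    counts = [sum(v) for v in votes]
    for k in range(1, n_cnns + 1):
        preds = [c >= k for c in counts]
        pos = [p for p, y in zip(preds, labels) if y == 1]
        neg = [p for p, y in zip(preds, labels) if y == 0]
        sensitivity = sum(pos) / len(pos)
        false_pos_rate = sum(neg) / len(neg)
        if sensitivity / max(false_pos_rate, 1e-9) > 1:
            return k
    return n_cnns  # fall back to requiring a unanimous vote

def ensemble_predict(votes, k):
    """Flag a CT study as showing cancer when >= k CNNs vote positive."""
    return [sum(v) >= k for v in votes]
```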

The researchers evaluated the performance of the segmentation CNN with Dice score per patient. Likewise, they assessed the performance of classification CNNs based on their respective sensitivity, specificity, and accuracy. The team calculated the area under the receiver operating characteristic curve (AUC) and LR. Finally, they used the McNemar test to compare the sensitivities of the CAD tool and radiologist interpretation.
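These evaluation metrics follow standard definitions and can be sketched as below; the helper names are illustrative, and segmentation masks are simplified to flattened 0/1 lists.

```python
def classification_metrics(preds, labels):
    """Sensitivity, specificity, accuracy, and positive likelihood ratio
    from binary predictions and ground-truth labels."""
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    tn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 0)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(labels)
    # positive LR = sensitivity / (1 - specificity)
    lr_plus = sensitivity / (1 - specificity) if specificity < 1 else float("inf")
    return sensitivity, specificity, accuracy, lr_plus

def dice_score(pred_mask, true_mask):
    """Dice coefficient between two binary segmentation masks,
    flattened to 0/1 lists: 2 * |A and B| / (|A| + |B|)."""
    intersection = sum(a * b for a, b in zip(pred_mask, true_mask))
    return 2 * intersection / (sum(pred_mask) + sum(true_mask))
```

The McNemar comparison of sensitivities is omitted here; it operates on the paired correct/incorrect calls of the CAD tool and the radiologists on the same cancer cases.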

Study findings

In the internal test set, the CAD tool distinguished malignant from control CT studies with 89.7% sensitivity and 92.8% specificity, including nearly 75% sensitivity for pancreatic cancers smaller than 2 cm. Overall, it demonstrated high robustness and generalizability. Notably, the CAD tool's sensitivity was comparable to that of attending radiologists at a tertiary academic institution with a large volume of pancreatic cancer patients (90.2% vs. 96.1%), indicating that the tool might exceed the sensitivity of less experienced radiologists and help reduce the miss rate attributable to disparities in radiologist expertise.

Furthermore, the tool appeared feasible for clinical deployment because it provides ample information to assist clinicians: it determined whether the images showed pancreatic cancer and indicated the probable location of the tumor, helping radiologists interpret the results quickly. Notably, in roughly 90% of the pancreatic cancers correctly identified by the CAD tool, the segmentation CNNs also pinpointed the tumor location. In addition, the CAD tool reported the positive LR, a measure of confidence in the cancer vs. non-cancer classification, which informs the subsequent diagnostic-therapeutic process better than a simple binary label.

Secondary signs in the non-tumorous portion of the pancreas, including pancreatic duct dilatation, upstream parenchymal atrophy, and abrupt cutoff of the pancreatic duct, are clues to occult pancreatic cancers, and a good diagnostic tool should be able to leverage them. In the current study, the classification CNNs correctly classified two pancreatic cancer cases by analyzing only the non-tumorous portion of the pancreas, suggesting they had learned these secondary signs spontaneously from the training examples.


Conclusions

The novel CAD tool used in the current study showed the potential to supplement radiologists for early and accurate detection of pancreatic cancers on CT scans. However, the finding that the classification CNNs might have learned the secondary signs of pancreatic cancer requires further investigation. Likewise, future studies should test the performance of this CAD tool in populations other than Asians (and Taiwanese) to gather data supporting its generalizability.


Written by

Neha Mathur

Neha is a digital marketing professional based in Gurugram, India. She earned a Master's degree with a specialization in Biotechnology from the University of Rajasthan in 2008. She gained pre-clinical research experience during her research project at the Department of Toxicology of the prestigious Central Drug Research Institute (CDRI), Lucknow, India. She also holds a certification in C++ programming.


Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Mathur, Neha. (2022, September 15). AI used to identify pancreatic cancer in CT scans. News-Medical. Retrieved on May 27, 2024.

  • MLA

    Mathur, Neha. "AI used to identify pancreatic cancer in CT scans". News-Medical. 27 May 2024.

  • Chicago

    Mathur, Neha. "AI used to identify pancreatic cancer in CT scans". News-Medical. (accessed May 27, 2024).

  • Harvard

    Mathur, Neha. 2022. AI used to identify pancreatic cancer in CT scans. News-Medical, viewed 27 May 2024.


The opinions expressed here are the views of the writer and do not necessarily reflect the views and opinions of News Medical.

