Open-source AI tool matches commercial systems in medical scan reporting

A new study from the University of Colorado Anschutz Medical Campus shows that free, open-source artificial intelligence (AI) tools can help doctors report medical scans just as well as more expensive commercial systems without putting patient privacy at risk.

The study was published today in the journal npj Digital Medicine.

The research highlights a promising, cost-effective alternative to widely known tools like ChatGPT, which are often expensive and may require sending sensitive data to outside servers.

"This is a big win for healthcare providers and patients. We've shown that hospitals don't need pricey or privacy-risky AI systems to get accurate results."

Aakriti Pandita, MD, lead author of the study and assistant professor of hospital medicine at the University of Colorado School of Medicine

Doctors often dictate notes or write free-text reports when reviewing medical scans such as ultrasounds. These notes are valuable, but they are not always in the structured format that many clinical tasks require. Structuring this information helps hospitals track patient outcomes, spot trends and conduct research more efficiently. AI tools are increasingly used to make this process faster and more accurate.

But many of the most advanced AI systems, such as GPT-4 from OpenAI, require sending patient data across the internet to external servers. That's a problem in healthcare, where privacy laws make protecting patient data a top priority.

The new study found that free AI models, which can be used inside hospital systems without sending data elsewhere, perform just as well as, and sometimes better than, commercial options.

The research team focused on a specific medical issue: thyroid nodules, lumps in the thyroid gland that are often found during neck ultrasounds. Doctors use a scoring system called ACR TI-RADS (Thyroid Imaging Reporting and Data System) to evaluate how likely these nodules are to be cancerous.

To train the AI tools without using real patient data, researchers created 3,000 fake, or "synthetic," radiology reports. These reports mimicked the kind of language doctors use but didn't contain any private information. The team then trained six different free AI models to read and score these reports.
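The study does not spell out its generation pipeline, but a common way to produce synthetic radiology reports like these is template filling: combining realistic descriptive vocabulary at random so that no sentence comes from a real patient. The vocabulary and wording below are illustrative assumptions, not the researchers' actual method.

```python
import random

# Hypothetical template-based generator: a minimal sketch of how
# synthetic thyroid ultrasound reports (containing no real patient
# data) might be produced. The feature lists mirror the kinds of
# characteristics ACR TI-RADS scoring looks at.
COMPOSITIONS = ["solid", "cystic", "spongiform", "mixed cystic and solid"]
ECHOGENICITIES = ["hyperechoic", "isoechoic", "hypoechoic", "very hypoechoic"]
SHAPES = ["wider-than-tall", "taller-than-wide"]
MARGINS = ["smooth", "ill-defined", "lobulated", "irregular"]
FOCI = ["none", "macrocalcifications", "punctate echogenic foci"]

def synthetic_report(rng: random.Random) -> str:
    """Return one synthetic free-text report describing a thyroid nodule."""
    size_mm = rng.randint(4, 40)
    return (
        f"A {size_mm} mm {rng.choice(COMPOSITIONS)} nodule in the right lobe, "
        f"{rng.choice(ECHOGENICITIES)}, {rng.choice(SHAPES)}, with "
        f"{rng.choice(MARGINS)} margins. Echogenic foci: {rng.choice(FOCI)}."
    )

rng = random.Random(0)
reports = [synthetic_report(rng) for _ in range(3000)]  # same count as the study
print(reports[0])
```

Because every field is drawn from a fixed vocabulary rather than copied from records, a generator of this kind cannot leak private information, which is the property that made synthetic training data attractive here.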

They tested the models on 50 real patient reports from a public dataset and compared the results to commercial AI tools like GPT-3.5 and GPT-4. One open-source model, called Yi-34B, performed as well as GPT-4 when given a few examples to learn from. Even smaller models, which can run on regular computers, did better than GPT-3.5 in some tests.
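"Given a few examples to learn from" refers to few-shot prompting: the model sees a handful of labeled report-to-score pairs before the report it must classify. The sketch below shows how such a prompt for a locally hosted model might be assembled; the example reports and labels are illustrative, not taken from the study.

```python
# Minimal few-shot prompt builder for TI-RADS scoring.
# The (report, label) pairs are hypothetical demonstrations.
FEW_SHOT_EXAMPLES = [
    ("A 5 mm spongiform nodule with smooth margins.", "TR1"),
    ("A 12 mm solid hypoechoic nodule, wider-than-tall, smooth margins.", "TR4"),
    ("A 20 mm solid very hypoechoic nodule, taller-than-wide, "
     "irregular margins, punctate echogenic foci.", "TR5"),
]

def build_prompt(report: str) -> str:
    """Assemble a few-shot prompt for a locally hosted language model."""
    lines = ["Assign an ACR TI-RADS level (TR1-TR5) to each report.", ""]
    for example, label in FEW_SHOT_EXAMPLES:
        lines += [f"Report: {example}", f"TI-RADS: {label}", ""]
    lines += [f"Report: {report}", "TI-RADS:"]
    return "\n".join(lines)

prompt = build_prompt("A 9 mm isoechoic nodule with lobulated margins.")
print(prompt)
```

The prompt ends mid-pattern ("TI-RADS:"), so the model's natural continuation is the score itself; no weights are updated, which is why even small locally run models can be adapted this way.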

"Commercial tools are powerful, but they're not always practical in healthcare settings," said Nikhil Madhuripan, MD, senior author of the study and Interim Section Chief of Abdominal Radiology at the University of Colorado School of Medicine. "They're expensive, and using them usually means sending patient data to a company's servers, which can pose serious privacy concerns."

In contrast, open-source AI tools can run inside a hospital's own secure system. That means no sensitive information needs to leave the building, and there's no need to buy large, expensive GPU clusters.

The study also shows that synthetic data can be a safe and effective way to train AI tools, especially when access to real patient records is limited. This opens the door to creating customized, affordable AI systems for many areas of healthcare.

The team hopes their approach can be used beyond radiology. In the future, Pandita said similar tools could help doctors review CT reports, organize medical notes or monitor how diseases progress over time.

"This isn't just about saving time," said Pandita. "It's about making AI tools that are truly usable in everyday medical settings without breaking the bank or compromising patient privacy."

Journal reference:

Pandita, A., et al. (2025). Synthetic data trained open-source language models are feasible alternatives to proprietary models for radiology reporting. npj Digital Medicine. https://doi.org/10.1038/s41746-025-01658-3

