Mar 23 2006
Every day in hospitals around the country, thousands of patients undergo CT, MRI, X-ray and other kinds of scans, producing detailed images of their bodies. Specially trained doctors "read" those images to look for problems, and then send a report of what they've found to each patient's own doctor.
But every once in a while, a new study finds, a patient falls through the cracks -- the victim of an incomplete handoff between doctors. If that patient's scan happens to show signs of cancer or another serious problem, the results could be disastrous. Fortunately, the study also shows, it may be possible to prevent such occurrences.
In a paper in the April issue of the American Journal of Roentgenology, a prominent journal for medical-imaging specialists, or radiologists, researchers from the University of Michigan Health System and the VA Ann Arbor Healthcare System report the results from their first year using an innovative automatic system at the Ann Arbor VA hospital.
In all, they show, the system kept eight patients with serious signs of trouble on their scans from falling through the cracks, including five who turned out to have cancer. And while the handoff between radiologists and other physicians took place correctly for the vast majority of the 395 patients whose scans revealed potential cancers, the authors say their findings show the value of an inexpensive "safety net" system to catch those few patients who might otherwise be missed.
"We know anecdotally that these problems happen around the country, and in fact they are the source of abundant malpractice litigation," says author Charles Marn, M.D., chief of radiology at the Ann Arbor VA and an associate professor of radiology at the U-M Medical School. "We developed this system after a situation that occurred at our own institution, and this one-year experience already shows that it has helped. We hope that other hospitals can use these findings to develop their own responses to this issue, especially as they implement computerized radiology systems."
Marn and his colleagues, including lead author and U-M radiology lecturer Vaishali Choksi, M.B.B.S., D.M.R.D., D.N.B., developed a system of codes that radiologists could assign to each medical image as electronic "tags." The study focused on scans that received a "Code 8" tag, meaning that the radiologist spotted an unexpected sign of cancer that required immediate follow-up by the patient's own physician.
Such scans were reported to the patient's physician via a written report and a direct phone notification about the unexpected finding that might indicate cancer. But as a backup, each week a staff member pulled up the computerized records tagged with Code 8s and checked whether each patient had received follow-up care. If not, she contacted the patient's physician, as well as the hospital's cancer-care group.
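The weekly safety-net check described above can be sketched in Python. The record fields, code values, and two-week window here are illustrative stand-ins based on the article's description, not the VA's actual software:

```python
from datetime import date

# Hypothetical scan records; field names are illustrative, not the VA's schema.
# Code 8 = unexpected finding suspicious for cancer, needing prompt follow-up.
scans = [
    {"patient": "A", "code": 8, "scan_date": date(2005, 3, 1), "followed_up": True},
    {"patient": "B", "code": 8, "scan_date": date(2005, 3, 2), "followed_up": False},
    {"patient": "C", "code": 4, "scan_date": date(2005, 3, 3), "followed_up": False},
]

def weekly_safety_net(scans, today, window_days=14):
    """Return Code 8 patients with no documented follow-up within the window."""
    flagged = []
    for s in scans:
        if s["code"] != 8:
            continue  # only cancer-suspicious findings are in this safety net
        overdue = (today - s["scan_date"]).days >= window_days
        if overdue and not s["followed_up"]:
            flagged.append(s["patient"])  # staff would then contact the physician
    return flagged

print(weekly_safety_net(scans, today=date(2005, 3, 20)))  # ['B']
```

Patient A had follow-up and patient C's scan carried a different code, so only patient B is flagged for manual contact.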
Of the 37,736 medical images made at the VA in the one-year study period, 395 received Code 8s, and 360 of those patients' computerized records showed that they had appropriate follow-up within two weeks.
For the 35 patients whose records showed no sign of follow-up, the staff member's contact with the doctors revealed that there had been follow-up for 25 of the patients, but it hadn't been noted in the computerized records yet. One other patient died soon after the Code 8 scan, and another elected not to have follow-up care.
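The counts reported above reconcile exactly, which a short arithmetic check makes explicit:

```python
# Figures as reported in the study.
total_code8 = 395
documented_follow_up = 360

flagged = total_code8 - documented_follow_up  # records with no noted follow-up
follow_up_not_yet_recorded = 25               # care given, record not yet updated
died = 1
declined_care = 1

truly_missed = flagged - follow_up_not_yet_recorded - died - declined_care
print(flagged, truly_missed)  # 35 8
```

The eight "truly missed" patients are the ones the safety net caught.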
But for eight patients, the doctor who had ordered the scan had not reacted to the Code 8 report from the radiologist, for whatever reason. Once follow-up care was initiated, five of those patients turned out to have malignant cancer, making up 2 percent of all cancers detected in the study year and 0.02 percent of all scans performed during the year.
Why would doctors fail to react to a radiologist's report about a potential cancer? There are many reasons, Marn says. For instance, an unexpected finding of cancer on a scan that had been ordered for an entirely different reason -- for example, to guide a surgeon performing a hip operation -- might not get immediate attention from the surgeon.
Or, the medical resident who ordered the scan originally might have finished his or her rotation in the hospital by the time the scan results came back, and the resident's replacement might not immediately process the report. Or, the researchers say, the report might simply get lost in the "crush of clinical information" that bombards physicians each day.
That's why the automated coding and reporting system developed at the Ann Arbor VA could be so useful in any hospital, Marn explains. The rapid increase in medical imaging in recent years, combined with the increased use of computerized medical records systems and digital medical-image systems called PACS, means the time is right to use digital technology to keep patients from falling through the cracks.
Missed follow-up on cancer scans is just one example of a patient-safety issue that stems from inadequate communication, incomplete handoffs between professionals and systemic "holes" that patients can slip through, Marn adds.
The researchers also collected data on patients with "Code 4" tags on their medical images, meaning that the radiologist had spotted something on the scan that wasn't cancer but might indicate another problem. They are now analyzing data from those patients.
Importantly, Marn says, the new paper also shows that no patients whose medical-imaging scans were ordered by emergency or urgent-care doctors fell through the cracks. However, the study was not able to determine if there were any differences in follow-up care between residents and attending physicians, or residents at different stages of their training.