Study explores the role of health care algorithms in racial and ethnic disparities

For years, it was harder for Black patients to secure a coveted spot on the national kidney transplant waitlist because a clinical algorithm was making Black patients appear healthier than they were. After a Penn Medicine researcher exposed the problem in 2019, and showed how it exacerbated racial disparities in kidney disease, a national taskforce recommended removing race from the algorithm's scoring, a move that has quickly been adopted throughout the country in an effort to reduce racial inequity.
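To see the mechanism in miniature, consider the minimal sketch below. Its multiplier, cutoff, and lab values are simplified stand-ins, loosely inspired by the roughly 1.16 race coefficient in the pre-2021 kidney-function equation and the eGFR level commonly used for transplant referral, rather than the actual clinical formula; it only illustrates how identical labs can produce a higher estimated kidney function, and therefore delayed eligibility, for a Black patient.

```python
# Illustrative sketch only: simplified numbers, not the published kidney-function
# (eGFR) equation or clinical guidance. It shows the mechanism described above:
# a race multiplier raises the estimated kidney function reported for Black
# patients, which can keep them above an eligibility threshold longer.

WAITLIST_THRESHOLD = 20.0  # eGFR (mL/min/1.73 m^2); illustrative referral cutoff

def estimated_gfr(base_estimate: float, black: bool, race_coefficient: float = 1.16) -> float:
    """Apply a race multiplier to a lab-derived kidney-function estimate.

    base_estimate: a hypothetical eGFR computed from creatinine, age, and sex.
    race_coefficient: illustrative multiplier, loosely modeled on the ~1.16
    factor the pre-2021 equation applied for Black patients.
    """
    return base_estimate * race_coefficient if black else base_estimate

base = 18.5  # same hypothetical labs for two otherwise identical patients

for black in (False, True):
    egfr = estimated_gfr(base, black=black)
    print(f"Black={black}: eGFR={egfr:.1f}, below waitlist threshold={egfr <= WAITLIST_THRESHOLD}")

# With the multiplier, the same labs read about 21.5 rather than 18.5, so the
# Black patient appears healthier and misses the cutoff; dropping the multiplier,
# as the taskforce recommended, treats identical labs identically.
```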

But that wasn't the only impact, according to a comprehensive new study by Penn researchers that digs deeper into the complicated issue of race and ethnicity in health care algorithms. Removing race from the kidney function algorithm also appeared to reduce access to chemotherapy, limit eligibility for Black patients in clinical trials, and affect medication dosing.

The new paper, published this month in the Annals of Internal Medicine, paints a nuanced picture of algorithms in health care (a ubiquitous, but often unseen, force in clinical decision making) and how their use can impact racial and ethnic disparities. The research team, led by Shazia Mehmood Siddique, MD, an assistant professor of Gastroenterology in Penn's Perelman School of Medicine, found that algorithms can mitigate, perpetuate, and exacerbate racial and ethnic disparities, regardless of whether they explicitly use race or ethnicity as an input.

"Intentionality matters," said Siddique, who also serves as director for research for Penn Medicine's Center for Evidence-Based Practice (CEP) and also the Penn Center for Healthcare Improvement and Patient Safety (CHIPS). "Racial and ethnic disparities cannot be an afterthought."

The researchers defined algorithms as mathematical equations that combine various data points and inputs, such as sex and age, to inform clinical care. Algorithms are embedded throughout health care, Siddique said, to help providers make complex clinical decisions, such as whether a patient should be diagnosed with a disease or is eligible for a particular treatment, and to help health systems determine how to allocate resources, such as care management and critical care services. Importantly, algorithms are often built into electronic health records, and neither patients nor clinicians are always aware they are being used.

To patients, it might seem as if medical criteria such as risk scores or treatment thresholds are based entirely on objective factors, said study co-author Brian Leas, a senior research analyst at CEP. But algorithms introduce a social component to clinical decision making. "How algorithms are constructed is a choice made by the developers," he said. "It's a decision to put certain factors together into a formula, and those decisions can be made differently."

Traditionally, health care algorithms were developed by researchers in academic settings. More recently, however, health systems, insurance companies, and electronic health record companies have begun to develop their own algorithms, while artificial intelligence tools have rapidly assumed a major role in fueling new algorithms, a shift that researchers said heightens the need for scrutiny.

In 2020, health care algorithms caught the attention of four U.S. senators, including New Jersey's Cory Booker. Citing the detrimental kidney disease algorithm, as well as another that unjustly lowered settlement benefits for concussion-related injuries for Black players in the National Football League, the lawmakers called on the Agency for Healthcare Research and Quality in the U.S. Department of Health and Human Services to conduct a review of race-based clinical algorithms in medical practice. The agency subsequently commissioned the Penn team's study, which was conducted in collaboration with ECRI, a nonprofit healthcare research organization focused on patient safety and reducing preventable harm.

In a systematic review of 63 studies, the researchers found that there's no silver bullet fix for issues associated with algorithms. Instead, they identified several strategies to mitigate disparities in health care algorithms, including adding a non-race variable, using data that reflects diverse racial and ethnic groups when developing algorithms, and swapping race with another more precise variable, such as genetic data or social factors that may impact care.
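One of those strategies, swapping race for a more precise variable, can be sketched in a few lines. The example below is hypothetical: the score, the weights, and the use of a neighborhood deprivation index are assumptions for illustration, not anything drawn from the studies reviewed.

```python
# Hypothetical sketch of one mitigation strategy from the review: swapping a
# coarse race input for the more precise factor it was standing in for (here, a
# neighborhood social-deprivation index scaled 0-1). Every name and weight is
# invented for illustration; none comes from the study or a validated model.

def risk_score_race_based(age: int, systolic_bp: float, is_black: bool) -> float:
    # Older pattern: race enters as a blunt binary adjustment.
    return 0.03 * age + 0.02 * systolic_bp + (0.5 if is_black else 0.0)

def risk_score_race_swapped(age: int, systolic_bp: float, deprivation_index: float) -> float:
    # Revised pattern: the exposure race was proxying is measured directly, so
    # the adjustment tracks the patient's actual circumstances rather than
    # their racial category.
    return 0.03 * age + 0.02 * systolic_bp + 0.5 * deprivation_index

print(risk_score_race_based(age=60, systolic_bp=140, is_black=True))            # about 5.1
print(risk_score_race_swapped(age=60, systolic_bp=140, deprivation_index=0.8))  # about 5.0
```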

Health care algorithms are most successful at reducing disparities when they intentionally reduce documented inequalities, Siddique said. In some cases, this meant including race as an algorithm component. For instance, a prostate cancer screening algorithm was found to overtest Black men, which led to unnecessary biopsies and complications. Adding Black race as an input in the algorithm mitigated the disparity.

But sometimes, as with the kidney disease algorithm, reducing disparities means compromising other outcomes. When race was removed as a variable from an algorithm for lung cancer screening eligibility, disparities in eligibility for Hispanic and Asian Americans improved, while disparities deepened for Black patients.

"A value judgment is needed when we are making a decision about these tradeoffs in outcomes. Is it worth this downside that we're going to see for this potential benefit? How can we further refine the algorithm to minimize disparities across groups?"

Shazia Mehmood Siddique, MD, Assistant Professor of Gastroenterology, Penn's Perelman School of Medicine

Researchers found a better option than removing race from the lung cancer eligibility algorithm outright: refine the model so that race and ethnicity remain as inputs, but a patient is not ruled ineligible when the only reason is a low life-expectancy estimate tied to their race or ethnicity. That refinement ultimately improved lung cancer eligibility disparities across the board.
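Read as decision logic, that refinement amounts to: keep race and ethnicity in the life-expectancy estimate, but never let a race-driven estimate be the sole reason for exclusion. The sketch below is one hedged interpretation, with invented thresholds and a stand-in model, not the algorithm the researchers evaluated.

```python
# A hedged reading of the refinement described above; every threshold, weight,
# and function name here is invented for illustration and is not the study's
# actual model. The idea: race and ethnicity stay in the life-expectancy
# estimate, but can never be the sole reason a patient is ruled ineligible.

MIN_LIFE_EXPECTANCY_YEARS = 5.0  # illustrative benefit threshold

def life_expectancy(age: int, comorbidity_score: float, race_adjustment: float,
                    use_race: bool) -> float:
    # Stand-in model: a base estimate minus comorbidity burden, optionally
    # shifted by a race/ethnicity term (negative where the model penalizes a group).
    base = 85.0 - age - 2.0 * comorbidity_score
    return base + (race_adjustment if use_race else 0.0)

def eligible(age: int, pack_years: float, comorbidity_score: float,
             race_adjustment: float) -> bool:
    if not (age >= 50 and pack_years >= 20):  # simplified smoking-risk criteria
        return False
    if life_expectancy(age, comorbidity_score, race_adjustment, use_race=True) >= MIN_LIFE_EXPECTANCY_YEARS:
        return True
    # The race-aware estimate alone blocked eligibility, so re-check without it:
    # a race-driven life-expectancy penalty cannot be the only exclusion.
    return life_expectancy(age, comorbidity_score, race_adjustment, use_race=False) >= MIN_LIFE_EXPECTANCY_YEARS

# Eligible: the race term alone would have pushed life expectancy below the bar.
print(eligible(age=70, pack_years=30, comorbidity_score=4.0, race_adjustment=-3.0))
```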

Race is often used in algorithms as a proxy for another variable, such as ancestry, a specific gene, social determinants of health or even the effects of systemic racism, Siddique said. The problem, she added, is that it is often unclear why race is being used in an algorithm. "We need algorithm developers to be clear about what race is being used as a proxy for, because clinicians may have no idea," Siddique said. "If there is no transparency about it, then it can perpetuate the false assumption that race is biologic."

A better option: replace race with a more precise variable. Siddique is now studying, for instance, whether replacing race with country of origin in a liver cancer screening guideline would reduce disparities. (While similar to algorithms, guidelines are non-mathematical, evidence-based recommendations typically developed by medical associations to help guide best clinical practices.)

The conversation about algorithms in health care, and their impact on racial and ethnic disparities, is just beginning. In response to the Penn-ECRI research, a diverse panel of health care experts convened late last year to offer guidance on mitigating algorithm bias. The group recommended guiding principles to support health care equity in the algorithm development and review process.

In a commentary in Health Affairs published in October 2023, Siddique and others promoted a "race-aware," rather than race-based, approach to algorithms, calling for increased diversity in clinical trials, a focus on precision medicine, and improved education on the factors that shape health outcomes.

The study was funded by the Agency for Healthcare Research and Quality.

Journal reference:

Siddique, S. M., et al. (2024). The Impact of Health Care Algorithms on Racial and Ethnic Disparities: A Systematic Review. Annals of Internal Medicine. https://doi.org/10.7326/M23-2960
