ChatGPT shows promise in polypharmacy management

Polypharmacy, or the concurrent use of five or more medications, is common in older adults and increases the risk of adverse drug interactions. While deprescribing unnecessary drugs can combat this risk, the decision-making process can be complex and time-consuming. Increasingly, there is a need for effective polypharmacy management tools that can support short-staffed primary care practitioners.

In a new study, researchers from the Mass General Brigham MESH Incubator found that ChatGPT, a generative artificial intelligence (AI) chatbot, showed promise as a tool to support polypharmacy management and deprescribing. The findings, published April 18 in the Journal of Medical Systems, represent the first reported use case of a generative AI model for medication management.

To evaluate its utility, the investigators provided ChatGPT with a series of clinical scenarios and asked it a set of decision-making questions. Each scenario featured the same elderly patient taking the same mix of medications but varied the patient's cardiovascular disease (CVD) history and degree of impairment in activities of daily living (ADL).
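As a rough illustration of this kind of evaluation, the sketch below sends scenario variants to a chat model through the OpenAI API. The vignette wording, medication list, model name, and the ask_scenario helper are assumptions for illustration only; the paper's actual prompts and settings are not described in this article.

```python
# Illustrative sketch only: the vignette text, medication list, and model name
# are assumptions, not the study's actual prompts or settings.
from itertools import product

from openai import OpenAI  # assumes the OpenAI v1 Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

BASE_VIGNETTE = (
    "An 80-year-old patient takes a statin, an antihypertensive, and an "
    "opioid analgesic. Cardiovascular disease history: {cvd}. "
    "Impairment in activities of daily living: {adl}."
)
QUESTION = "Should any of these medications be deprescribed? Answer yes or no."


def ask_scenario(cvd: str, adl: str) -> str:
    """Send one scenario variant as a fresh chat and return the model's reply."""
    prompt = f"{BASE_VIGNETTE.format(cvd=cvd, adl=adl)}\n{QUESTION}"
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder; the exact ChatGPT model is not stated in this article
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# Cross each CVD history with each ADL impairment level, mirroring the study design.
if __name__ == "__main__":
    for cvd, adl in product(["no history of CVD", "established CVD"],
                            ["mild", "moderate", "severe"]):
        print(f"{cvd} / ADL impairment {adl}: {ask_scenario(cvd, adl)}")
```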

When asked yes-or-no questions about reducing prescribed drugs, ChatGPT consistently recommended deprescribing medications for patients without a history of CVD. However, it became more cautious when underlying CVD was introduced and was more likely to keep the patient's medication regimen unchanged. In both cases, the researchers observed that the severity of ADL impairment did not appear to affect decision outcomes.

The team also noted that ChatGPT had a tendency to disregard pain and favored deprescribing pain medications over other drug types such as statins or antihypertensives. In addition, ChatGPT's responses varied when it was presented with the same scenario in new chat sessions, which the authors suggest could reflect inconsistency in the commonly reported clinical deprescribing trends on which the model was trained.
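That run-to-run variability could, in principle, be probed by resending an identical scenario in separate fresh sessions and tallying the answers. The snippet below is a hypothetical sketch reusing the ask_scenario helper from the earlier example; it is not the study's protocol.

```python
# Hypothetical consistency check, reusing the ask_scenario helper defined above.
# Each call opens an independent chat, so repeated trials expose run-to-run variation.
from collections import Counter


def consistency_check(cvd: str, adl: str, trials: int = 5) -> Counter:
    tally = Counter()
    for _ in range(trials):
        reply = ask_scenario(cvd, adl).strip().lower()
        tally["yes" if reply.startswith("yes") else "no/other"] += 1
    return tally


print(consistency_check("no history of CVD", "mild"))
```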

More than 40 percent of older adults meet the criteria for polypharmacy. In recent years, older adults on Medicare have seen a growing number of specialists added to their care teams, often leaving primary care providers to oversee medication management. An effective AI tool could help support this work, according to the researchers.

"Our study provides the first use case of ChatGPT as a clinical support tool for medication management. While caution should be taken to increase accuracy of such models, AI-assisted polypharmacy management could help alleviate the increasing burden on general practitioners. Further research with specifically trained AI tools may significantly enhance the care of aging patients."

Marc Succi, MD, Senior Corresponding Author, Associate Chair of Innovation and Commercialization at Mass General Brigham Radiology and Executive Director of the MESH Incubator

Arya Rao, lead author, MESH researcher, and Harvard medical student, added, "Our findings suggest that AI-based tools can play an important role in ensuring safe medication practices for older adults; it is imperative that we continue to refine these tools to account for the complexities of medical decision-making."

Journal reference:

Rao, A., et al. (2024). Proactive Polypharmacy Management Using Large Language Models: Opportunities to Enhance Geriatric Care. Journal of Medical Systems. https://doi.org/10.1007/s10916-024-02058-y
