Experts highlight the impact of unregulated health and AI apps for substance use reduction

In a commentary published by the Journal of the American Medical Association, researchers at Rutgers Health, Harvard University and the University of Pittsburgh discuss the impact of unregulated mobile health and generative artificial intelligence (AI) applications that claim to assist in substance use reduction.  

Jon-Patrick Allem, a member of the Rutgers Institute for Nicotine and Tobacco Studies, an associate professor at the Rutgers School of Public Health and senior author of the commentary, focuses on the need for greater oversight of new and untested technologies, such as mobile health and generative AI applications, and why public marketplaces need better rules to manage them.

Allem discusses the notion that greater transparency and stricter regulation could safeguard people from being misled by information portrayed as verifiable public health information.

What are the issues with substance use reduction mobile health apps?

Research shows that some mobile health apps can help people cut back on substance use, like alcohol, at least in controlled studies. But in the real world, their impact is limited.

App stores often promote products that generate revenue through ads rather than those backed by science, so the most visible apps are sometimes untested or misleading.

As a result, evidence-based apps may be harder to find. Systematic reviews of substance use reduction apps consistently show that most fail to use proven evidence-based approaches. Instead, they often make bold claims about how effective they are and use scientific-sounding language to seem more credible than they are.

How do you know if an app is evidence-based? 

To know whether an app is evidence-based, consumers can look for specific signs that it is built on proven research, not just marketing claims. Indicators include: the app cites scientific research (such as a peer-reviewed study); it was developed by experts in the area (built in collaboration with a university, licensed clinician or professional organization); it has been independently evaluated (published evaluations in scientific journals); it follows strict data standards (a clear explanation of how data is stored and compliance with regulations such as HIPAA); and it is free from exaggerated promises (such as guaranteed results).

What is the current landscape of regulation and enforcement in the app marketplace?

As of now, there is a massive lack of enforcement. Because so many health-related claims made by mobile applications are unsubstantiated, huge populations of people are left vulnerable to misinformation, which can hinder treatment and reduce the chances of recovery for individuals with a substance use disorder.

What are your concerns with using generative AI for substance use reduction apps?

The integration of generative AI into mobile health apps has flooded the marketplace with unregulated and untested products, because generative AI tools allow such apps to be developed and released rapidly.

Although general-purpose models such as ChatGPT can expand access to accurate health information, they also exhibit major safety lapses. These lapses range from providing inaccurate health information to failing to respond appropriately to crisis situations and normalizing unsafe behaviors.

What can consumers do to best protect themselves from unregulated health apps?

Consumers should avoid apps that use vague phrases like “clinically proven” without specific details or references, or apps that use methods that seem overly simple or too good to be true.

In what ways could oversight of generative AI be strengthened?

One potentially promising way to regulate today’s health app marketplace is to make Food and Drug Administration approval a requirement. This means apps should go through randomized clinical trials and meet a defined standard before becoming available to the public. 

Until then, clear labeling is key since people need to know which apps are backed by evidence and which are not. With the right safeguards and enforcement mechanisms in place, like fines, suspensions or removal of noncompliant products from app stores, we can make sure that mobile health apps are accurate, safe, and responsible. 

Journal reference:

Russell, A. M., et al. (2025). The Need for Oversight Over Apps for Substance Use Reduction. JAMA. doi:10.1001/jama.2025.19143. https://jamanetwork.com/journals/jama/article-abstract/2840575
