Regulators struggle to keep up with the fast-moving and complicated landscape of AI therapy apps

29.09.2025    The Denver Post

By DEVI SHASTRI

In the absence of stronger federal regulation, some states have begun regulating apps that offer AI "therapy" as more people turn to artificial intelligence for mental health advice.

But the laws, all passed this year, don't fully address the fast-changing landscape of AI software development. And app developers, policymakers and mental health advocates say the resulting patchwork of state laws isn't enough to protect users or hold the creators of harmful technology accountable.

"The reality is millions of people are using these tools and they're not going back," said Karin Andrea Stephan, CEO and co-founder of the mental health chatbot app Earkick.

EDITOR'S NOTE: This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988. There is also an online chat at 988lifeline.org.

The state laws take different approaches. Illinois and Nevada have banned the use of AI to treat mental health. Utah placed certain limits on therapy chatbots, including requiring them to protect users' health information and to clearly disclose that the chatbot isn't human. Pennsylvania, New Jersey and California are also considering ways to regulate AI therapy.

The impact on users varies. Some apps have blocked access in states with bans. Others say they're making no changes as they wait for more legal clarity.

And many of the laws don't cover generic chatbots like ChatGPT, which are not explicitly marketed for therapy but are used by an untold number of people for it. Those bots have attracted lawsuits in horrific instances where users lost their grip on reality or took their own lives after interacting with them.

Vaile Wright, who oversees health care innovation at the American Psychological Association, said the apps could fill a need, noting a nationwide shortage of mental health providers, high costs for care and uneven access for insured patients.

Mental health chatbots that are rooted in science, created with expert input and monitored by humans could change the landscape, Wright said.

"This could be something that helps people before they get to crisis," she said. "That's not what's on the commercial market right now."

That's why federal regulation and oversight is needed, she said.

Earlier this month, the Federal Trade Commission announced it was opening inquiries into seven AI chatbot companies, including the parent companies of Instagram and Facebook, Google, ChatGPT, Grok (the chatbot on X), Character.AI and Snapchat, on how they measure, test and monitor potentially negative impacts of this technology on children and teens. And the Food and Drug Administration is convening an advisory committee in November to review generative AI-enabled mental health devices.

Federal agencies could consider restrictions on how chatbots are marketed, limit addictive practices, require disclosures to users that they are not medical providers, require companies to track and report suicidal thoughts, and offer legal protections for people who report bad practices by companies, Wright said.

Not all apps have blocked access

From "companion" apps to "AI therapists" to "mental wellness" apps, AI's use in mental health care is varied and hard to define, let alone write laws around.

That has led to different regulatory approaches. Some states, for example, take aim at companion apps that are designed just for friendship, but don't wade into mental health care. The laws in Illinois and Nevada ban products that claim to provide mental health treatment outright, threatening fines for violations.

But even a single app can be tough to categorize. Earkick's Stephan said there is still a lot that is "very muddy" about Illinois' law, for example, and the company has not limited access there.

Stephan and her team initially held off calling their chatbot, which looks like a cartoon panda, a therapist. But when users began using the word in reviews, they embraced the terminology so the app would show up in searches.

Last week, they backed off using therapy and clinical terms again. Earkick's website described its chatbot as "Your empathetic AI counselor, equipped to support your mental health journey," but now it's a "chatbot for self care."

Still, "we're not diagnosing," Stephan maintained.

Users can set up a "panic button" to call a trusted loved one if they are in crisis, and the chatbot will nudge users to seek out a therapist if their mental health worsens. But it was never designed to be a suicide prevention app, Stephan said, and police would not be called if someone told the bot about thoughts of self-harm.

Stephan said she's happy that people are looking at AI with a critical eye, but worried about states' ability to keep up with innovation.

"The speed at which everything is evolving is massive," she said.

Other apps blocked access right away

When Illinois users download the AI therapy app Ash, a message urges them to email their legislators, arguing "misguided legislation" has banned apps like Ash "while leaving unregulated chatbots it intended to regulate free to cause harm."

A spokesperson for Ash did not respond to multiple requests for an interview.

Mario Treto Jr., secretary of the Illinois Department of Financial and Professional Regulation, said the goal was ultimately to make sure licensed therapists were the only ones doing therapy.

"Therapy is more than just word exchanges," Treto said. "It requires empathy, it requires clinical judgment, it requires ethical responsibility, none of which AI can truly replicate right now."

One chatbot app is trying to fully replicate therapy

In March, a Dartmouth College-based research team published the first known randomized clinical trial of a generative AI chatbot for mental health therapy.

The goal was to have the chatbot, called Therabot, treat people diagnosed with anxiety, depression or eating disorders. It was trained on vignettes and transcripts written by the team to illustrate an evidence-based response.

The study found users rated Therabot similar to a therapist and had meaningfully lower symptoms after eight weeks compared with people who didn't use it. Every interaction was monitored by a human who intervened if the chatbot's response was harmful or not evidence-based.

Nicholas Jacobson, a clinical psychologist whose lab is leading the research, said the results showed early promise but that larger studies are needed to demonstrate whether Therabot works for large numbers of people.

"The space is so dramatically new that I think the field needs to proceed with much greater caution than is happening right now," he said.

Many AI apps are optimized for engagement and are built to support everything users say, rather than challenging people's thoughts the way therapists do. Many walk the line of companionship and therapy, blurring intimacy boundaries therapists ethically would not.

Therabot's team sought to avoid those issues.

The app is still in testing and not widely available. But Jacobson worries about what strict bans will mean for developers taking a careful approach. He noted Illinois has no clear pathway to provide evidence that an app is safe and effective.

"They want to protect folks, but the traditional system right now is really failing folks," he said. "So trying to stick with the status quo is really not the thing to do."

Regulators and advocates of the laws say they are open to changes. But today's chatbots are not a solution to the mental health provider shortage, said Kyle Hillman, who lobbied for the bills in Illinois and Nevada through his affiliation with the National Association of Social Workers.

"Not everybody who's feeling sad needs a therapist," he said. But for people with real mental health issues or suicidal thoughts, telling them, "I know that there's a workforce shortage but here's a bot," is "such a privileged position."

This story has been corrected to show that Therabot is not a company and to delete an incorrect reference to Dartmouth as a university.

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute's Department of Science Education and the Robert Wood Johnson Foundation. The AP is solely responsible for all content.
