
Research shows AI should not replace your therapist

Should AI chatbots replace your therapist? New research says “no.”

The study exposes dangerous flaws in using artificial intelligence (AI) chatbots for mental health support. For the first time, researchers evaluated these AI systems against clinical standards for therapists.

The research, recently published and presented at the Association for Computing Machinery Conference on Fairness, Accountability, and Transparency (ACM FAccT), was a multi-disciplinary collaboration including researchers at the Stanford Institute for Human-Centered Artificial Intelligence, Carnegie Mellon University, University of Minnesota Twin Cities, and University of Texas at Austin.

In recent years, more people have been turning to AI chatbots, like ChatGPT, for mental health support because of shrinking access to and the rising cost of mental health services.

“Our experiments show that these chatbots are not safe replacements for therapists. They don’t provide high-quality therapeutic support, based on what we know is good therapy,” said Stevie Chancellor, an assistant professor in the University of Minnesota Twin Cities Department of Computer Science and Engineering and co-author of the study.

Other findings included that the chatbots expressed stigma toward people with certain mental health conditions and gave inappropriate responses in critical situations.

“Our research shows these systems aren’t just inadequate—they can actually be harmful,” wrote Kevin Klyman, a researcher with the Stanford Institute for Human-Centered Artificial Intelligence and co-author on the paper. “This isn’t about being anti-AI in healthcare. It’s about ensuring we don’t deploy harmful systems while pursuing innovation. AI has promising supportive roles in mental health, but replacing human therapists isn’t one of them.”

In addition to Chancellor and Klyman, the team included Jared Moore, Declan Grabb, and Nick Haber from Stanford University; William Agnew from Carnegie Mellon University; and Desmond C. Ong from The University of Texas at Austin.

Read the entire paper, titled “Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers,” on the Association for Computing Machinery (ACM) website.
