
Can AI replace therapists? Some patients think so as they turn to ChatGPT

MDlinx Apr 15, 2023

Reports of patients turning to artificial intelligence (AI) technology—specifically ChatGPT, a free-to-use AI chatbot developed by OpenAI and released last November—for therapy are increasingly common.

In fact, some Reddit users claim that the chatbot is “better than my therapist,” while a quick search on Twitter shows that people are already turning to the technology for support.

This isn’t the first time we’ve heard of chatbots being used in the healthcare space. A viral tweet from cardiologist Eric Topol described a case in which ChatGPT correctly diagnosed a patient—prompting the patient to seek antibody testing and receive an actual diagnosis.

So, is ChatGPT the new frontier for mental healthcare—and how do professionals feel about it? As it turns out, the answer is more multifaceted than you may think. 

 

ChatGPT’s main drawback: It’s not human

 

Research in Psychiatriki has shown that the dynamic between therapist and patient accounts for 85 percent of the therapeutic effect, while the actual therapeutic techniques employed account for only 15 percent of the outcome.

Kallergis G. [The contribution of the relationship between therapist-patient and the context of the professional relationship]. Psychiatriki. 2019;30(2):165-174.


Ileana Arganda-Stevens, LMFT, of Thrive Therapy & Counseling in Sacramento, CA, believes the human-to-human element is key.

“We know that the primary determinant of client satisfaction in therapy is the relationship with the therapist. Many people who seek out therapy have felt dehumanized in some way—whether that be through abuse, neglect, oppression, or not being able to be their authentic selves,” she says. “Being validated, seen, accepted, and respected—by another human being—can help us to feel more human. AI [has] no human experience or feelings from which…to offer empathy or compassion to a client.”

David M. Reiss, MD, a psychiatrist, agrees. “A machine [can never] pick up the nuances of verbal tone, inflection, facial expression, body language, eye contact, overall demeanor, and attitude,” he says. 

Understanding these subtleties, he says, can lead to proper diagnosis and management of a patient. He worries that a ChatGPT diagnosis could fail to capture certain outliers. 

“Every person has personality traits and features that are neither diagnosable nor pathological,” he says. “The subtle information carried by such understanding [of] a person’s personality style, and the nature of the psychological defensive structure, can only be very roughly approximated by diagnostic ‘criteria’ or algorithms.”

 

Other drawbacks and concerns

 

Exactly how AI should be used in therapeutic settings is also up for discussion—and the line between unethical and helpful is murky at best. According to an article in Psychiatric Times, a free therapy program called Koko faced backlash when its founder, Rob Morris, announced that it had run an experiment in which an AI chatbot responded to users’ mental health queries.

Critics say the experiment was conducted without the program’s users knowing they were chatting with AI—prompting questions about when and where the use of AI is acceptable.

Hidden use of ChatGPT in online mental health counseling raises ethical concerns. Psychiatrist.com.

 

But even if patients do know they’re chatting with AI, the information they may be receiving could be inaccurate, says Ryan Sultan, MD, director of Integrative Psychiatry and research professor at Columbia University. “Limitations [of ChatGPT] include the increasingly apparent poor information that the systems generate, making proper education of the system from high-quality literature on psychotherapy essential,” he says. 

In fact, an article in Immunology Letters found that when ChatGPT was used to write a medical paper, it “encountered great difficulties when asked for details and references and made many incorrect statements.”

Wittmann J. Science fact vs science fiction: A ChatGPT immunological review experiment gone awry. Immunol Lett. 2023;256-257:42-47.

 

Arganda-Stevens also thinks the use of AI in a therapeutic setting poses another potentially deeper question: “I wonder how our increasing reliance on technology is impacting our emotional well-being and the natural human need for connectedness and belonging,” she says. “What will be the impact of further decreasing human interaction?”

 

Limitations aside, could there be a place for ChatGPT in therapy?

 

Some experts think the technology could be put to use, with the caveat that it shouldn’t be used to replace traditional therapy. 

According to Hampton Lintorn-Catlin, a computer programmer and chief technology officer of Thriveworks, a mental health company, ChatGPT may be best utilized as a simple sounding board.

“If you are dealing with a personal issue or a problem at work, sometimes just the act of having a conversation with someone…can help you think through your ideas and help clarify your own thoughts on a subject,” Lintorn-Catlin says. “Having a casual conversation with a non-expert can help bring clarity to your own thoughts. As long as you are looking for answers from yourself, not your AI conversation partner, then it can be helpful.” He notes, however, that it can be dangerous to use AI to replace therapy from a qualified expert.

It may also be capable of more than that, Sultan believes. AI could potentially offer basic triage for new patients, he says, or be taught to deliver some structured therapies—such as cognitive behavioral therapy, dialectical behavior therapy, and supportive therapy. Further, ChatGPT may offer a pathway to support patients who lack access to mental healthcare.

Outside of the human-to-AI dynamic, ChatGPT could—perhaps ironically—help therapists better communicate with their patients. According to an article in MIT Technology Review, professionals in the mental health space are analyzing the natural-language processing technology behind AI (what allows ChatGPT to ‘sound’ human in its responses) in the hopes that it might teach therapists how to better express themselves in a back-and-forth conversation with a client.

The therapists using AI to make therapy better. MIT Technology Review.


Overall, Reiss thinks ChatGPT’s use in therapy could be a mixed bag. “Might some people be helped? Certainly. Are there risks in ChatGPT missing important subtle non-verbal communication? Significantly. Might it water down therapy to simple counseling? Definitely. [It could be] potentially helpful if risks can be mitigated, but even then, that just ain't therapy,” he says. 

For now, AI doesn’t seem to be leaving the chatroom anytime soon. In fact, the conversation around its place in therapy is likely only just beginning. Some researchers believe that digging into its ‘black box’—the way AI works—will pave the way toward its application in healthcare.

Poon AIF, Sung JJY. Opening the black box of AI-Medicine. J Gastroenterol Hepatol. 2021;36(3):581-584.

 
