
Can an AI bot replace your therapist?

MDlinx Aug 15, 2023

The healthcare industry has adopted artificial intelligence (AI) technology to diagnose disease, monitor patients, and even discover new drugs. 

But how can AI assist in the field of mental health, an endeavor flush with human emotion? Researchers are both hopeful and cautious about how AI could innovate mental health treatment.

Managing data with AI

In a review published in Current Psychiatry Reports, researchers wrote that mental healthcare has been slow to adopt AI technology but could benefit greatly from wider adoption, especially for parsing large data sets.

Graham S, Depp C, Lee EE, et al. Artificial intelligence for mental health and mental illnesses: An overview. Curr Psychiatry Rep. 2019;21(11):116.

Mental health practitioners generally rely on soft skills, such as forming relationships with patients and observing their emotions. AI has shown promise in combing through mounds of data to screen and diagnose mental illnesses.

“Leveraging AI techniques offers the ability to develop better pre-diagnosis screening tools and formulate risk models to determine an individual’s predisposition for, or risk of developing, mental illness,” the review authors wrote. “To implement personalized mental healthcare as a long-term goal, we need to harness computational approaches best suited to big data.”

Preventing suicide

Using AI to churn through large datasets may even help predict and mitigate suicidal behaviors.

In a paper published in the Australian & New Zealand Journal of Psychiatry, researchers used an AI predictive analytics tool to search old datasets.

Fonseka TM, Bhat V, Kennedy SH. The utility of artificial intelligence in suicide risk prediction and the management of suicidal behaviors. Aust N Z J Psychiatry. 2019;53(10):954–964.

They found that this tool could be used to generate a risk algorithm to identify when individuals or populations are in need of emotional support, or even emergency assistance. Used proactively, this could potentially prevent suicide.

“There could be several advantages of incorporating artificial intelligence into suicide care, which includes a time- and resource-effective alternative to clinician-based strategies, adaptability to various settings and demographics, and suitability for use in remote locations with limited access to mental healthcare supports,” the authors wrote. 

An AI chatbot as your therapist

AI tools in mental healthcare aren’t just for data—there are now AI tools that chat with people about their feelings and problems.

For example, there’s an AI chatbot service called Wysa, which prompts users with questions like “What’s bothering you?” then analyzes their responses.

Noguchi Y. Therapy by chatbot? The promise and challenges in using AI for mental health. NPR. January 19, 2023.

The tool gives users supportive messages and advice—pre-written by psychologists—on managing issues like grief, depression, and chronic pain. 

In the 2021 Springer collection Multiple Perspectives on Artificial Intelligence in Healthcare, researchers wrote that chatbots have shown benefits in psychoeducation and treatment adherence.

Denecke K, Abd-Alrazaq AA, Househ MS. Artificial Intelligence for Chatbots in Mental Health: Opportunities and Challenges. In: Househ M, Borycki E, Kushniruk A (eds). Multiple Perspectives on Artificial Intelligence in Healthcare. Opportunities and Challenges. Cham, Switzerland: Springer; 2021.

But the authors also warn that chatbots could interfere with the patient-therapist relationship and may cause users to over-rely on chatbots, which have demonstrated limited emotional intelligence. 

Defending the role of human therapists, authors of a 2021 paper published in SSM Mental Health argued that human interactions, not chatbots, must remain the first line of mental health treatment.

Brown JEH, Halpern J. AI chatbots cannot replace human interactions in the pursuit of more inclusive mental healthcare. SSM - Mental Health. 2021;1:100017.

This is especially true for the most vulnerable populations, they wrote. 

Most Americans seem to bristle at the idea of using chatbots for their own mental healthcare.

According to a survey conducted by the Pew Research Center, 71% of people say that they would not use an AI chatbot for their own mental health support.

Tyson A, Pasquini G, Spencer A, et al. 60% of Americans would be uncomfortable with provider relying on AI in their own health care. Pew Research Center. February 22, 2023.

Additionally, 28% disliked the idea so strongly that they believe such chatbots shouldn’t be available at all.

Ethical guidance needed

Researchers warn that guardrails must be in place to prevent misuse before AI is widely used in mental healthcare.

In an article about the ethical implications of AI in mental healthcare published in the Journal of Medical Internet Research, the authors suggested creating and implementing standard guidelines for AI in this field.

Fiske A, Henningsen P, Buyx A. Your robot therapist will see you now: Ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. J Med Internet Res. 2019;21(5):e13216.

These authors also argued that the use of AI tools in mental healthcare should be transparent, with mental health professionals continuing to oversee AI tools and algorithms. Clinicians must also avoid substituting robotic tools for traditional mental healthcare for those in need, they wrote.

Can AI discern emotions in people’s facial expressions?

In addition to AI tools that dig into data and analyze human writing, there are now tools that purportedly recognize emotion in human faces. 

Ellie, a virtual avatar developed at the University of Southern California, uses nonverbal cues to guide the conversation, soothing users with human-like vocal intonations like “Hmmm.”

Royer A. The wellness industry’s risky embrace of AI-driven mental health care. Brookings. October 14, 2021.

Kate Crawford, PhD, a research professor at University of Southern California Annenberg and principal researcher at Microsoft Research, wrote in 2021 in Nature that the emotion recognition industry may be worth $37 billion by 2026.

Crawford K. Time to regulate AI that interprets human emotions. Nature. 2021;592(7853):167.

Such projections run high, even though scientists disagree as to whether AI can detect emotions. In Dr. Crawford’s view, these tools need to be regulated to avoid being misused.

“These safeguards will recentre rigorous science and reject the mythology that internal states are just another data set that can be scraped from our faces,” Crawford wrote.

The future of AI in mental healthcare

While there’s hope that AI can be of great assistance to mental healthcare, further research is required. Like the AI devices that claim to detect emotions in faces, many of the AI tools in the mental health space remain understudied.

For example, a 2020 study found that only 2.08% of psychosocial and wellness mobile apps are backed by research.

Lau N, O’Daffer A, Colt S, et al. Android and iPhone mobile apps for psychosocial wellness and stress management: Systematic search in app stores and literature review. JMIR Mhealth Uhealth. 2020;8(5):e17798.

As AI grows more capable, so will the need for discernment in how it is applied. In an editorial published in the Journal of Mental Health, Sarah Carr, senior fellow in mental health policy at the University of Birmingham, wrote that the industry must consider the perspectives of patients, carers, and families in determining how AI can best serve their mental healthcare needs.

Carr S. ‘AI gone mental’: Engagement and ethics in data-driven technology for mental health. J Mental Health. 2020;29(2):125–130.

“If AI is to be increasingly used in the treatment and care of people with mental health problems then patients, service users and carers should participate as experts in its design, research and development,” Carr wrote.

What this means for you

AI in mental healthcare is being adopted slowly yet developing rapidly. People who can’t afford a therapist may find comfort in talking through their issues with a chatbot, but they should be aware that chatbot efficacy is still being researched. Mental health clinicians can expect to see AI used more widely in the field, especially for uncovering trends and supporting diagnosis via big data.
