Are AI Chatbots the Therapists of the Future?


New research suggests chatbots can help deliver certain types of psychotherapy.

  • AI chatbots are promising for skills-based coaching and cognitive behavioral therapy, but delivering other forms of therapy could be harder.

  • Chatbot therapists could help provide therapy that is scalable, widely accessible, convenient, and affordable, but they would have limitations.

  • The effectiveness of certain types of psychotherapy may rely on a human element that chatbots are unable to provide.

Could artificial intelligence chatbots be the therapists of the future? ChatGPT, a text-generating conversational chatbot made using OpenAI’s powerful language processing model GPT-4, has reignited this decades-old question.

Released in early 2023, GPT-4 is a fourth-generation generative pre-trained transformer, a neural network machine learning model trained on massive amounts of conversational text from the internet and refined with training from human reviewers.

The large language model GPT-4 and its previous versions have been used in many ways across industries: to write a play produced in the U.K., create a text-based adventure game, build apps for non-coders, and generate phishing emails as part of a study on harmful use cases. In 2021, a game developer created a chatbot that emulated his late fiancée until OpenAI shut down the project.

AI chatbots are promising for certain types of therapy that are more structured and skills-based (e.g., cognitive behavioral therapy, dialectical behavioral therapy, or health coaching).

Research has shown that chatbots can teach people skills and coach them to stop smoking, eat healthier, and exercise more. One chatbot, SlimMe AI, which uses artificial empathy, helped people lose weight. During the COVID-19 pandemic, the World Health Organization developed virtual humans and chatbots to help people stop smoking. Many companies have created chatbots for mental health support, including Woebot, launched in 2017 and based on cognitive behavioral therapy. Other chatbots deliver guided meditation or monitor one's mood.

Source: Vaidyam, et al. 2019.

The Missing Human Element

Chatbots could be trained to deliver multiple modalities of psychotherapy. But is knowing that you are talking to a human an essential ingredient of effective psychotherapy?

The answer will most likely vary by type of psychotherapy and needs further research.

Certain types of therapy, such as psychodynamic, insight-oriented, relational, or humanistic therapy, could be trickier to deliver via chatbots, since it is still unclear how effective these approaches can be when clients know they are not connected to another human. But some therapies do not rely as heavily on the therapeutic alliance. Trauma specialist Bessel van der Kolk describes effectively treating a client with Eye Movement Desensitization and Reprocessing (EMDR) even though the client said that he did not feel much of an alliance with van der Kolk.

Potential Advantages to Chatbot Therapists

  • Scalability, accessibility, affordability. Virtual chatbot therapy, if done effectively, could help bring mental health services to more people, on their own time and in their own homes.

  • People can be less self-conscious and more forthcoming to a chatbot. Some studies found that people can feel more comfortable disclosing private or embarrassing information to chatbots.

  • Standardized, uniform, and trackable delivery of care. Chatbots can offer a standardized and more predictable set of responses, and these interactions can be reviewed and analyzed later.

  • Multiple modalities. Chatbots could be trained to offer specific styles of therapy beyond what an individual human therapist might offer. Building in the ability to assess which style of therapy would be most appropriate at any given moment would allow an AI therapist to draw on a much broader knowledge base than a human therapist.

  • Personalization of therapy. ChatGPT generates conversational text in response to text prompts and can remember previous prompts, which could allow it to serve as a personalized therapist.

  • Access to broad psychoeducation resources. Chatbots could draw from and connect clients to large-scale digitally available resources, including websites, books, or online tools.

  • Augmentation or collaboration with human therapists. Chatbots could augment therapy in real-time by offering feedback or suggestions, such as improving empathy.

Potential Limitations and Challenges of Chatbot Therapists

Chatbot therapists face barriers that are specific to human-AI interaction.

  • Authenticity and empathy. What are human attitudes toward chatbots, and will they be a barrier to healing? Will people miss the human connection in therapy? Even if chatbots could offer empathic language and the right words, this alone may not suffice. Research has shown that people prefer human-to-human interaction in certain emotional situations, such as venting or expressing frustration or anger. A 2021 study found that people's comfort with a chatbot versus a human depended on how they felt: when people were angry, they were less satisfied with a chatbot. People may not feel as understood or heard when they know there is no actual human at the other end of the conversation. The "active ingredient" of therapy could rely on the human-to-human connection: a human bearing witness to one's difficulties or suffering. AI replacement will likely not work for all situations. There is also the possibility that relying on an AI-powered chatbot for psychotherapy could stall or even worsen a person's progress, especially if they struggle with social connections and human relationships.

  • Timing and nuanced interactions. Many therapy styles require features beyond empathy, including a well-timed balance of challenge and support. Chatbots are limited to text responses and cannot communicate through eye contact and body language. This may become possible with AI-powered "virtual human" or "human avatar" therapists, but it is unknown whether virtual humans can provide the same level of comfort and trust.

  • Difficulty with accountability and retention. People may be more likely to show up and stay accountable to human therapists than to chatbots. User engagement is a major challenge for mental health apps: estimates show that only 4% of users who download a mental health app are still using it after 15 days, and only 3% after 30 days. Will people show up as regularly to a chatbot therapist?

  • Complex, high-risk situations such as suicide assessment and crisis management would benefit from human judgment and oversight. In high-risk cases, AI augmentation with human oversight (a human "in the loop") would be safer than replacement by AI. There are also open ethical and legal questions about liability: who is responsible if a chatbot therapist fails to assess or manage an urgent crisis appropriately or provides wrong guidance? Will the AI be trained to flag and alert professionals to situations with potential imminent risk of harm to self or others? Could relying on a chatbot therapist delay or deter people from seeking the help they need?

  • Increased need for user data security, privacy, transparency, and informed consent. Mental health data requires a high level of protection and confidentiality. Many mental health apps are not forthcoming about what happens to user data, including when data is used for research. Transparency, security, and clear informed consent will be key features of any chatbot platform.

  • Potential hidden bias. It is important to be vigilant of underlying biases in training data of these chatbots and to find ways to mitigate them.

As human-AI interactions become part of daily life, further research is needed to determine whether chatbot therapists can effectively provide psychotherapy beyond behavioral coaching. Studies comparing the effectiveness of therapy delivered by human therapists versus AI-powered chatbots across various therapy styles will reveal the advantages and limitations of chatbot therapy.

Marlynn Wei, MD, PLLC Copyright © 2023. All rights reserved.

Subscribe to The Psychology of AI by Dr. Marlynn Wei on LinkedIn or Substack.
