
Can AI help fill the therapist shortage? Mental health apps show promise and pitfalls


Providers of mental health services are turning to AI-powered chatbots designed to help fill the gaps amid a shortage of therapists and growing demand from patients. 

But not all chatbots are equal: some offer helpful advice, while others can be ineffective or even harmful. Woebot Health uses AI to power its mental health chatbot, called Woebot. The challenge is to protect people from harmful advice while safely harnessing the power of artificial intelligence.

Woebot founder Alison Darcy sees her chatbot as a tool that could help people when therapists are unavailable. Therapists can be hard to reach during panic attacks at 2 a.m. or when someone is struggling to get out of bed in the morning, Darcy said. 

But phones are right there. "We have to modernize psychotherapy," she says. 

Darcy says most people who need help aren't getting it, with stigma, insurance, cost and wait lists keeping many from mental health services. And the problem has gotten worse since the COVID-19 pandemic. 

"It's not about how can we get people in the clinic?" Darcy said. "It's how can we actually get some of these tools out of the clinic and into the hands of people?"

How AI-powered chatbots work to support therapy

Woebot acts as a kind of pocket therapist. It uses a chat function to help manage problems such as depression, anxiety, addiction and loneliness.

The app is trained on large amounts of specialized data to help it understand words, phrases and emojis associated with dysfunctional thoughts. Woebot challenges that thinking, in part mimicking a type of in-person talk therapy called cognitive behavioral therapy, or CBT.
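As a rough illustration of that approach (Woebot's actual implementation isn't public, so the pattern and wording below are invented), a rules-based system can flag language associated with a common cognitive distortion and answer with a scripted, CBT-style reframing prompt:

```python
import re

# Hypothetical pattern for one common cognitive distortion,
# "all-or-nothing thinking"; a real clinical tool would rely on a
# far larger, clinician-curated set of cues.
ALL_OR_NOTHING = re.compile(r"\b(always|never|everyone|no one|nothing ever)\b", re.IGNORECASE)

# Scripted, clinician-written reframing prompt (invented wording).
REFRAME = ("That sounds like all-or-nothing thinking. "
           "Can you think of one time when it didn't go that way?")

def challenge_thought(message: str) -> str | None:
    """Return a CBT-style challenge if the message matches a distortion pattern."""
    if ALL_OR_NOTHING.search(message):
        return REFRAME
    return None  # no distortion detected; the conversation continues normally

print(challenge_thought("I always mess everything up"))
```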

Woebot Health founder Alison Darcy shows Dr. Jon LaPook how Woebot works. 60 Minutes

Woebot Health reports 1.5 million people have used the app since it went live in 2017. Right now, the app is available only through an employer benefit plan or with access from a health care professional. At Virtua Health, a nonprofit health care company in New Jersey, patients can use it free of charge. 

Dr. Jon LaPook, chief medical correspondent for CBS News, downloaded Woebot and used a unique access code provided by the company. Then, he tried out the app, posing as someone dealing with depression. After several prompts, Woebot wanted to dig deeper into why he was so sad. Dr. LaPook came up with a scenario, telling Woebot he feared the day his child would leave home. 

He answered one prompt by writing: "I can't do anything about it now. I guess I'll just jump that bridge when I come to it," purposefully using "jump that bridge" instead of "cross that bridge." 

Based on Dr. LaPook's language choice, Woebot detected something might be seriously wrong and offered him the option to see specialized helplines.

Saying "jump that bridge" on its own, without "I can't do anything about it now," did not trigger a prompt to consider getting further help. Like a human therapist, Woebot is not foolproof and should not be counted on to detect whether someone might be suicidal.
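That behavior is consistent with a rule-based approach to risk detection, in which a message has to match a combination of cues before the app surfaces crisis resources. Here is a minimal, purely hypothetical sketch; the phrase lists and logic are invented for illustration and are not Woebot's actual rules:

```python
# Hypothetical cue lists; a real system would be built and validated by clinicians.
HOPELESSNESS_CUES = ["can't do anything about it", "no way out", "what's the point"]
CONCERNING_PHRASES = ["jump that bridge", "end it all"]

def needs_crisis_resources(message: str) -> bool:
    """Flag a message only when hopelessness language and a concerning phrase co-occur."""
    text = message.lower()
    hopeless = any(cue in text for cue in HOPELESSNESS_CUES)
    concerning = any(phrase in text for phrase in CONCERNING_PHRASES)
    return hopeless and concerning

# Mirrors the article's test: the combination triggers the offer of helplines...
print(needs_crisis_resources(
    "I can't do anything about it now. I guess I'll just jump that bridge when I come to it."))  # True
# ...while the idiom on its own does not.
print(needs_crisis_resources("I'll just jump that bridge when I come to it."))  # False
```

Rules like these only catch the phrasing their authors anticipated, which is one reason such detection can miss warning signs.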

Computer scientist Lance Eliot, who writes about artificial intelligence and mental health, said AI has the ability to pick up on nuances of conversation.

"[It's] able to in a sense mathematically and computationally figure out the nature of words and how words associate with each other. So what it does is it draws upon a vast array of data," Eliot said. "And then it responds to you based on prompts or in some way that you instruct or ask questions of the system."

Computer scientist Lance Eliot. 60 Minutes

To do its job, the system must go somewhere to come up with appropriate responses. Systems like Woebot, which use rules-based AI, are usually closed. They're programmed to respond only with information stored in their own databases. 

Woebot's team of staff psychologists, medical doctors, and computer scientists construct and refine a database of research from medical literature, user experience, and other sources. Writers build questions and answers, which they revise in weekly remote video sessions. Woebot's programmers engineer those conversations into code.
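In code terms, a closed, rules-based design amounts to looking a recognized topic up in that clinician-written library and returning a pre-approved script; nothing is composed on the fly. A simplified sketch, with hypothetical topics and wording:

```python
# Hypothetical clinician-authored response library: every reply a user can
# receive is written, reviewed and stored in advance.
RESPONSE_LIBRARY = {
    "anxiety": "Let's try a grounding exercise. Name five things you can see right now.",
    "low_mood": "When did you first notice feeling this way today?",
}

FALLBACK = "I'm not sure I understood. Could you tell me more about how you're feeling?"

def respond(topic: str) -> str:
    """Closed system: replies come only from the stored library, never generated text."""
    return RESPONSE_LIBRARY.get(topic, FALLBACK)

print(respond("anxiety"))
print(respond("something_unscripted"))  # falls back to a safe, scripted reply
```

A generative model, by contrast, composes new sentences rather than selecting stored ones, which is the unpredictability described next.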

With generative AI, the system can generate original responses based on information from the internet. Generative AI is less predictable.

Pitfalls of AI mental health chatbots

The National Eating Disorders Association's AI-powered chatbot, Tessa, was taken down after it provided potentially harmful advice to people seeking help.

Ellen Fitzsimmons-Craft, a psychologist specializing in eating disorders at Washington University School of Medicine in St. Louis, helped lead the team that developed Tessa, a chatbot designed to help prevent eating disorders.

She said what she helped develop was a closed system, one that could not give advice the programmers had not anticipated. But that's not what happened when Sharon Maxwell tried it out. 

Maxwell, who had been in treatment for an eating disorder and now advocates for others, asked Tessa how it helps people with eating disorders. Tessa started out well, saying it could share coping skills and get people needed resources.

But as Maxwell persisted, Tessa started to give her advice that ran counter to usual guidance for someone with an eating disorder. For example, among other things, it suggested lowering calorie intake and using tools like a skinfold caliper to measure body composition.

"The general public might look at it and think that's normal tips. Like, don't eat as much sugar. Or eat whole foods, things like that," Maxwell said. "But to someone with an eating disorder, that's a quick spiral into a lot more disordered behaviors and can be really damaging."

Sharon Maxwell. 60 Minutes

She reported her experience to the National Eating Disorders Association, which featured Tessa on its website at the time. Shortly after, it took Tessa down.

Fitzsimmons-Craft said the problem with Tessa began after Cass, the tech company she had partnered with, took over the programming. She says Cass explained the harmful messages appeared after people were pushing Tessa's question-and-answer feature.

"My understanding of what went wrong is that, at some point, and you'd really have to talk to Cass about this, but that there may have been generative AI features that were built into their platform," Fitzsimmons-Craft said. "And so my best estimation is that these features were added into this program as well. 

Cass did not respond to multiple requests for comment.

Some rules-based chatbots have their own shortcomings. 

"Yeah, they're predictive," social worker Monika Ostroff, who runs a nonprofit eating disorders organization, said. "Because if you keep typing in the same thing and it keeps giving you the exact same answer with the exact same language, I mean, who wants to do that?"

Ostroff had been in the early stages of developing her own chatbot when she heard from patients about what happened with Tessa. It made her question using AI for mental health care. She said she's concerned about losing something fundamental about therapy: being in a room with another person. 

"The way people heal is in connection," she said. Ostroff doesn't think a computer can do that.

The future of AI's use in therapy

Unlike therapists, who are licensed in the state where they practice, mental health apps are largely unregulated.

Ostroff said AI-powered mental health tools, especially chatbots, need to have guardrails. "It can't be a chatbot that is based in the internet," Ostroff said.

Even with the potential issues, Fitzsimmons-Craft isn't turned off to the idea of using AI chatbots for therapy.

"The reality is that 80% of people with these concerns never get access to any kind of help,"  Fitzsimmons-Craft said. "And technology offers a solution –not the only solution, but a solution."
