Is AI Therapy Really a Good Idea?

It’s here, so let’s examine the pros, cons and unknowns of outsourcing your counsel to an LLM

[Image: AI is excellent at recognizing patterns — so maybe it could help you kick your worst ones?]

Think of a period when you were struggling with work, your relationship or just going through a hard time. Maybe loved ones suggested therapy, but it wasn’t a practical solution at the time. You couldn’t commit to regular appointments, afford the co-pay or find someone to take your insurance. The unsuccessful search might’ve even made you feel worse.

But what if you could’ve just asked ChatGPT to give you advice instead? And better yet, in the voice of famous psychologist Carl Rogers? AI has advanced to a point where that is now on the table. Cognitive behavioral therapy (CBT) is a technique that educates people about their thought and behavioral patterns, and specifically how to change those that don’t serve them. Since AI trains algorithms using pattern recognition, “CBT is positioned better for AI than most other forms of therapy,” psychotherapist Amy Morin tells me. 

Morin is also a mental strength coach who is behind the app Mentally Strong. It’s not ChatGPT, but the app, powered by DownPat, uses AI to help people cope with human problems using tools from her book 13 Things Mentally Strong People Don’t Do.

“It can help them identify whether they need to solve the problem or solve how they feel about the problem,” Morin explains. For instance, you might need to set a boundary if you feel overwhelmed in your relationship. But in other cases, reframing your relationship anxieties and utilizing healthier coping skills like meditation, exercise and spending time with friends can help. “The app assists people in deciding which angle to tackle while also giving them support, ideas and feedback.”

Morin acknowledges the limitations of the technology. “AI can’t pick up non-verbal cues, and it can’t glean the emotional nuances in a conversation,” she admits. Consequently, AI-based therapy is not equipped to handle severe mental health conditions, especially if someone’s symptoms threaten the safety of others or themselves. Even in less extreme cases, people might prefer a face-to-face connection. 

What Sort of Therapy Does AI Offer?

Psychotherapist Jordan Conrad has researched the intersection of therapy and technology, and points out that AI can be a form of “stepped care.”

Stepped care is a model of psychotherapy that accepts there is no therapeutic technique that is 100% effective for everyone at all times. “It works by matching a patient to the least costly and invasive treatment relevant to their needs, and then stepping up, as it were, to more intensive treatments as needed,” he explains.

Solutions like self-help books, pamphlets and other educational resources, as well as emerging AI therapy, make up the lower steps of this model.


Combining It With Traditional Therapy

Morin emphasizes that her app isn’t meant to replace therapy. It’s intended to be used alongside therapy with a trained professional.

“The truth is, many people aren’t going to see a therapist,” she acknowledges. “Whether the stigma of talking to a therapist is preventing them from doing it or there are financial constraints, therapy isn’t accessible to everyone.” Spending months — if not years — in therapy can be expensive. She believes that AI interventions could assist in the process.

“Not everyone comes to therapy to gain deep insight into the structure of their thinking or to explore why they value the things they do,” Conrad notes. “Many people want to feel better and if AI can help remediate symptoms, they will be happy with that.” If that frees up a psychotherapist who can see someone else with a higher level of need, everybody wins, right?

According to the Health Resources and Services Administration, roughly 122 million Americans live in federally designated mental health professional shortage areas. Additional data indicates that this issue will only worsen. “That means that even people who want to see a therapist can’t,” Conrad says.

Is AI a Threat to Therapists’ Jobs?

Many therapists fear that AI could replace them. Despite AI’s potential for reducing the mental health burden in the U.S., Conrad concedes that this fear is valid.

AI is getting a lot of attention right now, and plenty of powerful people are invested in what it could potentially do. If technology advances to a degree where it matches or surpasses human capabilities, “therapists’ jobs will be the last thing we will consider,” he says. But for now, “there are enough people who are unsure about the tech or who would simply rather have a human taking care of them.”

The Bottom Line

While Conrad believes AI holds promise for mental health, the technology isn’t regulated to the standards of psychotherapists, who keep diligent medical records. That leaves already vulnerable people at the mercy of tech companies, which could exploit their data.

One promising exception may be the emerging app Rejoyn, the first digital treatment for depression approved by the FDA. A prescription is required, but the platform’s privacy policy states that “[a]ny Health Information that is tied to an Individual’s Personal Information will be treated as Personal Information, provided that any Protected Health Information will be protected in accordance with the requirements of HIPAA, if applicable.”

“That sounds like a step in the right direction, safety-wise,” Conrad says. He encourages all therapists to educate themselves on how to protect their clients from advancing AI. “It’s an exciting time for AI and it is important not to get swept up in it at the expense of patient safety.”

Ultimately, even if AI can theoretically help you deal with your relationship rough patch, the potential of having your personal information violated could pose a greater psychological risk than you’re ready for. So before you confess your deepest, darkest secrets to ChatGPT or a trendy mental health app, try it the old-fashioned way. Pick up a book on motivation, and that may help you stomach the search for a human therapist who takes your insurance. 
