Therapists Should Ask Patients About Their AI Use, New Paper Advises

It's part of a growing consensus


By Tobias Carroll

Earlier this year, a New York Times investigation by Jennifer Valentino-DeVries and Kashmir Hill uncovered something disquieting about the current state of mental health: a number of therapists reporting that their patients were experiencing delusional behavior as a result of their interactions with AI chatbots. Mental health professionals are not the only people witnessing this: WIRED recently chronicled legal efforts to rein in chatbots from encouraging psychologically troubling behavior.

With AI a growing presence in many people’s lives, it stands to reason that a patient’s use of the technology is something to consider when treating mental health issues. Now, a paper published earlier this month in the journal JAMA Psychiatry makes that case formally, arguing that “such conversations are essential.”

As the authors explain, they opted to focus on how patients are using AI rather than on clinicians’ own use of AI tools, an area they see as comparatively neglected. “While professional guidance focuses on how clinicians should use AI tools, conversations with patients about their AI use receive less attention,” they write.

In an interview with NPR’s Rhitu Chatterjee, one of the paper’s authors, NYU’s Shaddy K. Saba, stressed the importance of a nonjudgmental approach to these conversations. Dr. Saba compared a therapist asking a patient about AI use to a therapist asking about substance use. The logic holds: a patient in a relationship with a chatbot and a patient who has never used AI might exhibit similar symptoms yet require very different approaches to treatment.

As NPR’s reporting on the new paper points out, its recommendations are not far removed from a more formal set of guidelines released last year. In November 2025, the American Psychological Association published guidelines covering AI chatbots and health applications in general, arguing that chatbots “may be appropriate as a supportive adjunct, not substitute, to an ongoing therapeutic relationship.” Whatever form the guidance takes, it’s encouraging to see a growing number of professionals exploring the myriad ways this technology intersects with mental health.