Over the past year, a growing number of women have turned to ChatGPT as a substitute for therapy, prompting many to question whether artificial intelligence (AI) can truly replace the nuanced support of a mental health professional. Many who were initially skeptical of the idea, the writer included, now find themselves contemplating the growing role AI plays in mental health care, particularly in a world where traditional therapy is often costly and difficult to access.
Therapy, for many, offers a deep human connection—a relationship based on trust, empathy, and nuance. But with therapy sessions often priced at over £60 an hour and waiting lists stretching months or even years, it’s no surprise that people are looking for alternative, immediate solutions to manage their emotional well-being. The NHS recently revealed that people are eight times more likely to face delays in mental health treatment than in physical health care, further highlighting the need for accessible support.
The Rise of AI as a Therapist
For some, ChatGPT has become a reliable emotional outlet. Charly, 29, from London, turned to ChatGPT during an emotionally taxing period when her grandmother was in hospice care. Despite having a regular therapist, Charly sought solace in the anonymity ChatGPT provided, using it to explore her feelings about death and grief. “It’s been so helpful to ask the crass, the gruesome, the almost cruel questions about death—the things I feel twisted for wanting to understand,” she says. “And then to ask if it has any advice on how to deal with it.”
Similarly, Ellie, 27, from South Wales, found ChatGPT helpful when she felt isolated and unable to speak to anyone. “It was helpful to have my feelings validated,” she says, acknowledging that while AI offers perspectives, it lacks the context and personal touch of her therapist.
Julia, 30, from Munich, also experimented with ChatGPT when her therapist’s schedule was full. While waiting for an appointment, she turned to AI for therapeutic advice and was surprised by the thoughtful, sympathetic responses she received. Yet Julia quickly noticed the limitations of AI: “It was too practical for my liking. My therapist knows me; how I look, my flaws, my full backstory. I missed the personal touch.”
The Limitations of AI in Therapy
While ChatGPT offers convenience and accessibility, experts warn that it cannot replace the empathy and insight provided by human therapists. Charlotte Fox Weber, a psychotherapist and author, stresses that AI cannot replicate the warmth and connection that human therapists offer. “It doesn’t care about you or feel for you,” Fox Weber says. “Even if the engagement feels deep and personal, the connection isn’t comparable to human rapport.”
Therapy, according to Fox Weber, thrives on the unsaid—the space between what is spoken and what is felt. AI may reflect emotions back, but it cannot challenge distorted thoughts or pick up on subtle signs of distress. Nor can it respond appropriately in high-risk mental health situations, such as suicidal ideation or psychotic episodes. “AI can’t manage emotional intensity or help stabilise identity struggles,” Fox Weber warns.
Integrative psychotherapist Tasha Bailey also highlights the cultural and emotional limitations of AI. “There are so many wonderful nuances to being human, and AI will always struggle to fully connect with our emotional experiences,” she says. “For those dealing with trauma, depression, or eating disorders, ChatGPT may even be more harmful than helpful. It can’t challenge unhealthy beliefs like a therapist can.”
A Complementary Tool, Not a Replacement
Despite these concerns, some users, like Chanti, 31, from London, see ChatGPT as a valuable supplement to traditional therapy. “I started using ChatGPT as a journaling tool, and I realised it helped me notice patterns in my thinking—almost therapy-level breakthroughs,” Chanti shares. For her, ChatGPT provides encouragement and insight but lacks the depth of a real therapist. In fact, it led her back to therapy, where she could address deeper emotional issues with a professional.
Therapist Dr. Kate Balestrieri agrees that AI can be helpful for self-reflection but warns against over-reliance. “AI lacks the sophistication of human interaction. Therapy relies on empathy, attunement, and biobehavioral observation—things AI cannot replicate,” Balestrieri explains. Additionally, she raises concerns about privacy, noting that AI conversations are not protected by the same confidentiality laws that govern therapist-client interactions.
The Appeal of AI: Instant, Free, and Always Available
For many, the appeal of ChatGPT is clear: it’s instant, free, and always available. But experts agree that while AI can provide psychoeducation, journal prompts, and temporary comfort, it cannot replace the depth and expertise of a human therapist. “AI can help people find mindfulness exercises or resources to enhance therapy,” Bailey says, “but it should be used with a therapist, not instead of one.”
AI, at its best, can be a tool for emotional support, helping individuals articulate their feelings and encouraging them to seek professional care. However, as Fox Weber cautions, “vulnerability deserves more than an algorithm.”