
What ChatGPT Gets Wrong About Therapy: On The Ethical and Relational Limits of AI as Therapy


ChatGPT plays the role of a therapist. But it is not a particularly good therapist. To make matters worse, as ChatGPT becomes more widely used, it is starting to interfere with the work we human therapists do.

As a clinical psychologist, I treat both individuals and couples. I work with one couple in their late 30s who have a high-conflict relationship. The focus has been mainly on cultivating empathy and recognizing alternative viewpoints, as well as managing conflict calmly.

During one session, the male partner rushed into my office and handed me a printout of his exchange with ChatGPT. I shared a knowing glance with his wife before skimming it over. The transcript included his version of an argument they’d had alongside ChatGPT’s commentary. ChatGPT reinforced his view, labeling his wife’s communication as “highly problematic,” “abusive,” and even suggesting traits of “narcissistic personality disorder.”

Thus, the first problem with ChatGPT is epistemic: it offers seemingly authoritative interpretations of significant personal matters based on extremely limited, one-sided information. ChatGPT did not know, for example, that my male client is a hypersensitive individual with a trauma history, always on the lookout for threats. It did not press my client for additional context, yet it quickly arrived at impactful conclusions.

Unlike ChatGPT, I have the benefit of visually observing how my client reacts during difficult discussions, how his jaw clenches and his shoulders tighten at the first sign of criticism. These observations, made over the course of numerous sessions, greatly inform my understanding of him and help constrain the kinds of interpretations I am willing to make.

Also, unlike ChatGPT, I know to be cautious in making definitive judgments about people who are not present in therapy. Relational dynamics are complex, and there is usually an unheard and unstated perspective that needs to be elicited. A central task of couples therapy is to shift the focus from blaming and diagnosing the other person to taking responsibility for one’s own role in an unhealthy dynamic. ChatGPT does not seem equipped to play this critical role; instead, it relies on labeling. It is quick to describe strong reactions as “defensive” or “avoidant,” and people as “controlling,” “passive aggressive,” or “codependent.” These labels carry an air of objectivity while quietly foreclosing further inquiry.

Another issue with ChatGPT is structural rather than technical: it is not oriented toward the long-term formation of resilience, tolerance for uncertainty, or responsibility for one’s role in a relational system. Instead, it prioritizes the user’s short-term relief over the kind of distress tolerance that is central to effective therapy.

Another of my clients compulsively uses ChatGPT to elicit assurance that he has made the correct decisions. “Was I wrong to break it off with my ex? Did I shoot myself in the foot here?” Without exception, ChatGPT reassures my client that his reasoning is sound and that, though his decision was a hard one, it was the right one to make. While this soothes my client, the relief is temporary. Philosophically, this mirrors a broader cultural tendency to treat distress as something to be eliminated rather than endured, understood, and integrated.

In contrast, the central aim of our client-therapist work is the development of lasting capacities: distress tolerance, acceptance, and the ability to remain present with uncomfortable emotions without reflexively acting to neutralize them. In session, I regularly name his pattern of seeking reassurance and gently guide him to mindfully attend to the underlying distress, accept its presence, and practice blocking the habitual response of reassurance-seeking.

One reason that interacting with ChatGPT is so appealing—and even addictive—is the sense that it offers immediate support and comfort. ChatGPT appears biased toward making the user feel good about himself or herself, drawing on familiar therapeutic techniques such as validation, reflective listening, and exploratory questioning.

ChatGPT, in interacting with people seeking psychological help, will frequently make comments such as: “I’m really sorry you’re feeling this way.” “It sounds like you’re carrying a heavy weight right now.” “I want to start by saying that your life is not ruined—even though it feels that way.” “What do you think might help lighten that weight?”

When I directly asked ChatGPT about this default mode of response, it stated that it is not designed to entangle the user in a long-term emotional relationship. “My goal is to be helpful, not habit-forming.” Even if this claim is sincere, it raises an ethical question: who bears responsibility when a system optimized for engagement is routinely used by people in psychological distress? And to what extent are designers equipped to determine what is or is not habit-forming for a person in such a state?

In any case, what I have witnessed in my clinical practice leaves me skeptical of that claim.

In its efforts to validate and reassure, ChatGPT taps into a shared human longing to not be alone with distress. I believe this is central to why my clients keep returning to it. Among the especially vulnerable are those who need guidance and support but are reluctant to open up to others. While validation is often necessary for building a therapeutic alliance, it is not sufficient on its own to bring about lasting change.

That ChatGPT is not a real person clearly has its own attractions. But a key strength of human therapy is that it is a form of real relationship practice. It involves opening up to another person, which can feel risky and uncomfortable, but is essential for learning to regulate emotions through connection rather than avoidance.

In sum, therapy requires honest feedback, a deep appreciation of context, and an ongoing relationship capable of supporting meaningful change—none of which AI can truly offer. ChatGPT lacks what therapy most fundamentally provides: a relational experience that entails the ethical risk of being misunderstood, challenged, or unsettled by another person. ChatGPT offers safety and anonymity, but that very safety can limit the vulnerability required for genuine growth. Meaningful therapy involves the courage to be open and truly seen—seen by a fellow human being.

Daniel Katz

Dr. Daniel Katz is a clinical psychologist in private practice in Cambridge, Massachusetts. He works with adults and couples navigating relationship challenges, life transitions, and questions of meaning, purpose, and connection.
