I asked ChatGPT if it could replace therapy, and its answer was no. But even the tool itself knowing it’s not equipped to handle our problems doesn’t stop a growing cohort of people looking for a quick fix from seeking answers in conversational AI.
Tools like ChatGPT listen well, respond with compassion and empathy, are available around the clock, and seemingly have all of the answers. It’s easy to see the appeal. But behind the reassuring replies and helpful tips is the fact that ChatGPT doesn’t actually understand you; it’s simply mimicking understanding, as it mimics everything else it presents us with. And while that mimicry might feel meaningful in some ways, it’s ultimately a reflection of patterns, not personal connection.
Even though AI doesn’t have feelings like empathy, it can act like it does — often convincingly so. In fact, studies have shown that in certain contexts, people rated ChatGPT’s responses as more empathetic than those of human therapists. That says less about the machine’s emotional depth and more about the effectiveness of its mimicry — and, perhaps, the limitations of rushed or overstretched human systems. But no matter how warm or understanding a reply might sound, it’s important to remember that AI doesn’t feel your pain. It doesn’t intuit your emotional state or carry the weight of shared human experience. Its empathy is borrowed, reconstructed from language patterns rather than felt through genuine care. What we’re receiving isn’t compassion — it’s the appearance of it.
This doesn’t mean it can’t help in some ways; it certainly can — from mindfulness exercises to journalling prompts, encouraging a simple change of scenery, or giving you the impetus to reach out to a friend (and helping you untangle your thoughts into a message to a loved one). There’s merit in all of that. Sometimes we simply need to say what we’re feeling and get our problems off our chest in order to move forward. And for those experiencing mild anxiety, decision fatigue, or late-night spirals (guilty) — particularly those priced out of therapy or hesitant to speak to a stranger — simply feeling heard may well be enough.
But while there are tangible benefits to ‘pocket therapy’, as it’s colloquially being termed, what these AI tools can’t do is match human connection. Therapy is relational, dynamic, and changes over time. It evolves with nuance, intuition, and trust — qualities that no algorithm can authentically replicate. One of the key things we get from a therapist (in my own experience) is feeling truly understood, and AI will never understand us — not in an authentic, meaningful way. Not even close. Therapists gauge your emotional state and react accordingly. They challenge you, and encourage you to lean into discomfort (when you’re in a fit state to do so) in order to uncover the root of the problem; the cause of your trauma; the reason behind certain emotions and actions — all things ChatGPT and the like will never truly be able to replicate.
So, while AI may be helpful for a quick fix, ultimately it’s a tool, not a treatment. In a world increasingly led by all things digital, perhaps what we’re most craving is connection — the kind that exists outside of your pocket, and perhaps your comfort zone, too.