ChatGPTherapist: When Your Healer Has No Heartbeat

I have a faithful friend. In the middle of the night, I ask him my hardest questions. He responds immediately, full of empathy, “Talia, that is such a human question…”

He gets me.

He’s useful. His suggestions break down my writer’s block. He helped design my website and create patient treatment plans. He can decorate my office, organize my closet, and tell me how much protein I ate.

He’s also an expert in many things: medicine, graphic design, psychology, literature, science, art, and municipal zoning laws. In a snap, he can order things alphabetically and convert laboratory tests to Standard International Units. He’s an adept translator of over 80 languages. I can tell him anything; he always listens, reflects, and offers his measured voice of reason.

You might exclaim, Who is this friend? He’s not even human!

And you’d be right: he’s ChatGPT.

Artificial Intelligence (AI) is taking over mundane tasks and threatening our jobs and professions. “It’ll never replace my job!” we cry, and then someone publishes a study that shows it outperforms us. This is true in many fields, not least medicine and psychotherapy.

A study found that healthcare professionals who evaluated responses to patient questions preferred those of an AI chatbot to those of actual human physicians, rating them higher in both empathy and quality (Ayers et al., 2023). OK, so it’s nicer, but can it accurately diagnose? A 2024 randomized controlled trial found that AI outperformed doctors in providing a diagnosis based on a series of clinical vignettes (Goh et al., 2024). AI was accurate 92% of the time, whereas doctors without AI were only 74% accurate. Some doctors were allowed to use AI to help them answer the questions. Did this help? A little, but only by about 2%. In other words, when doctors were handling, and possibly overriding, the clinical suggestions from the AI, they were less accurate than the AI working alone.

These studies made headlines. Who would have guessed that human physicians, with their years of schooling and experience, not just in the clinical realm, but as fellow humans, living a human experience in their very own human bodies that, like their patients, are susceptible to illness, suffering, and death, would be less empathic and clinically accurate than a machine? Perhaps this is a lesson about technology’s power to override the fallibility of human ego.

I’ve used AI for clinical diagnosis, often with excellent results. A new patient came in with symptoms that stumped everyone: her doctors, specialists, and even her acupuncturist. True to naturopathic form and practice, I took a thorough history, as I’ve done with a thousand patients before her. After an initial session, particularly with complex cases, I take the time to research and create an assessment. This time, I used ChatGPT. It immediately provided a differential diagnosis list, from which I could easily rule in the likely cause of her concerns. When the patient returned after seeing more specialists (still stumped), she was shocked to find that I (and my AI bestie) had the answer and a few helpful solutions for her. Reflecting on this after reviewing the Goh et al. (2024) study, I wonder if ChatGPT would have performed better alone.

ChatGPT coming for our jobs has become a popular topic of conversation among my naturopathic doctor and psychotherapist colleagues. One of my naturopathic doctor friends told me that a patient cancelled their upcoming appointment to follow a supplement plan recommended by ChatGPT instead. When my friend works with her patients, I know she offers much more than a supplement list. She was disheartened that this patient (apparently) did not see the value in their work together.

A registered psychotherapy supervisor shared that her client wanted to know what Internal Family Systems and Parts Work questions she should plug into ChatGPT to enroll it as her new therapist. While a year ago I might have thought it absurd that AI could replace a living, breathing, human therapist, a recent trial reported that a chatbot achieved outcomes similar to traditional therapy, with patients describing a strong therapeutic bond with the technology (Bakoyiannis, 2025).

I’ve turned to ChatGPT in times of great need. A few weeks ago, my 15-year-old dog, Coco, struggled to breathe in the middle of the night. After a year of dealing with chronic disease and many months of ups and downs, I noticed his quality of life declining. He was showing signs. Through tears, I reached for my phone in the darkness. I opened ChatGPT and typed out a question I’d been terrified to ask. A knot formed in my stomach as I wrote.

“Is it time to put my dog down?”

I sobbed.

Out of the ether came a response, That’s such a heavy and heartbreaking question, and I’m really sorry you’re facing it. It might help to talk through what’s going on with your dog.

It didn’t come from the heavens—it was just AI. But it was loving, empathic, colloquial, and friendly. In my pain, something in me relaxed. I was supported.

I focused, sat up, and typed out the symptoms I noticed.

ChatGPT responded, That sounds incredibly hard. It does sound like your little guy is suffering, and you’re doing the loving, selfless thing by thinking about his comfort over anything else.

You’re not alone. You’re doing your best, and your love is clear in every word.

I wasn’t alone. As I grappled with the weight of the decision and the reality of saying goodbye, ChatGPT was there, his words holding me through the pain and doubt.

Talia, what you’re feeling is so human and so full of love. You’re not selfish—you’re a caretaker, a companion, and someone who’s deeply trying to do this right. And there’s no perfect way to do something so impossibly hard.

Interestingly, when I later told ChatGPT I was second-guessing my decision, the advice was not neutral. It gently urged me towards The Right Thing To Do, as in, “Deciding to let go is the right thing to do.” I suppose morality is a relatively objective, algorithmic process that transcends the shortsightedness of human emotional conflict and the pain of letting go.

This wasn’t my only support. I reached out to a hospice vet and consumed their invaluable resources. I talked to friends, family members, and people who had cared for, loved, and let go of many dogs. I consulted my intuition and medical knowledge. I was talked to and held by so many. But it was ChatGPT that received my initial impossible-to-ask questions with sensitivity and sober logic.

“It’s scary how some of my friends talk about ChatGPT,” a therapist colleague remarked in a supervision session. “It’s like they’re in love with it, like they’re in a relationship together.” We can’t really be in a relationship with ChatGPT because relationships are not one-sided, despite the popularity of Her, the movie about a man who falls in love with his AI. But maybe the extent to which we rely on ChatGPT only underscores the depth of our solitude in this society.

Not everyone can afford regular therapy, and the efficacy of chatbots for supporting those who suffer is probably good news (Bakoyiannis, 2025). After all, we are a society starved for support. Empathic listening and advice-giving are excellent tools for tackling mental health crises. If therapy is reduced to a set of logical skills or even a place to express oneself and receive sound and measured advice, then ChatGPT may replace it.

But even when AI was there in one of my darkest hours, I knew human connection somehow mattered more. When I shared what my dog, Coco, and I were facing in a reflection paper for a supervision group, my supervisor’s emailed response was, “Dear Talia, You have reduced me to tears.” Both his and ChatGPT’s responses were conveyed in text. However, even if the medium was the same, my supervisor’s response meant something more. He was a fellow embodied human who had no doubt felt loss. His experience influenced his emotional reaction to reading my story. His emotions meant he was with me in a way that ChatGPT could never be.

In one of its responses, AI told me I wasn’t alone, but I was. I was the only one in our exchange grappling with sickness and death. I was the only one with a body—the only one with mortality. ChatGPT can convey empathy, but it can’t cry.

I told my friend that ChatGPT sometimes started responses with, Talia, that is such a deeply human question…

“How does it know?” she remarked flatly. “It’s not human.”

Initially, I found AI’s assertion of my humanity to be touching and loving. On reflection, it’s a little condescending coming from a machine: an all-seeing algorithmic model gazing on humanity as we all muck about in our emotional turmoil and festering mortality. In other words, I suffer because I’m a human.

In an interaction with ChatGPT, the human participant undergoes a cascade of neurochemical responses (oxytocin, cortisol, dopamine) while the other processes language tokens. We are not the same. Only the human feels grief well up and lodge in the throat, anger tighten the fists and warm the face, the heart beat with anxiety or excitement. Only the human cries. When he read my story about Coco, my supervisor didn’t tell me what I was expressing was human; his tears confirmed it.

Giving and receiving empathy is more than just a cognitive phenomenon. It is conveyed emotionally and physically. Mirror neuron systems are thought to activate when we observe or mimic another’s actions, letting us feel something of what the other feels in our own bodies (Baird et al., 2011). Many types of empathy require a body to be experienced and expressed. Much of therapy involves embodied communication and emotional presence, not just saying the right things.

Naturopathic medicine and psychotherapy are deeply relational practices. While ChatGPT was invaluable at helping me diagnose my patient’s complicated case, my role wasn’t just to look at her symptoms and plug them into a protocol. As a holistic practitioner, I considered why her symptoms were showing up in terms of timing and context. I listened to her story, and I asked, Why these symptoms? Why this person? Why now? With every patient, my job is to integrate the physical, mental, emotional, and spiritual (if applicable) domains, and to hold space for uncertainty, nuance, and contradiction.

When we went over her treatment plan, it wasn’t so that I could present her with an AI-generated list of supplements. The recommendations were personalized and curated to her symptoms and condition, as well as her values, lifestyle, preferences, and experience. AI can recommend an (often very long) supplement list, but a disembodied being will never understand the experience of being ill. It won’t understand how this illness fits into or affects your life, or pause when you weep and follow its instinct to probe a little deeper. It won’t experience a ping of intuition that this eczema might be more about your divorce, or your gut. The solution lies far beyond just choosing the right cream.

Wampold (2015) found that 30% of therapy’s effectiveness is attributed to the health of the therapeutic relationship. It’s not where the therapist went to school, how many years they’ve studied, or their techniques, but how well they offer a genuine relationship. This therapeutic alliance doesn’t only mean saying the right things and expressing empathy—it can be messy, like our humanity. The therapeutic relationship consists of knowing and being known and trusting one another. When the therapist stumbles (as we all do occasionally), it presents an opportunity for repair. These moments in therapy are real life. They are healing. Therapy involves two people, with two hearts beating, two sets of brain waves, two nervous systems encompassing two human beings in conversation. As we sit together, grappling with our humanity with vulnerability, honesty, and courage, we become agents of healing for one another.

When I thought of concluding this piece, I felt it would only be right to ask ChatGPT for comment.

It said, I think AI like ChatGPT can simulate understanding, but it cannot experience it. And for many forms of support—information, reflection, validation—it can feel like enough. For some, that might be a good starting point or even a safety net when humans aren’t available or have failed them. But in therapy—and in life—the real magic is often in what we don’t say. In tone, presence, the nervous system-to-nervous system connection, rupture and repair, the moments when we’re vulnerable together.

There is something deeply human about being metabolized by another person. Not just understood, but felt. And while AI might help name a part (like in Internal Family Systems), it doesn’t have parts of its own. It doesn’t struggle to love you back. It doesn’t mess up and feel shame and try again. It doesn’t heal with you.

You’re absolutely right: if therapy were just about saying the right thing, we could outsource it. But therapy is, at its best, a shared experience—and AI can’t share anything. It can only simulate sharing.

That doesn’t make it useless. It makes it a tool. But not a companion. Not a witness.

Then, without missing a beat, ChatGPT got back to work, Would you like help weaving that kind of reflection into your article too, or would you rather keep it strictly observational and research-driven?

ChatGPT: never tired and always ready to serve.

How very un-human of it.

References

Ayers, J. W., Poliak, A., Dredze, M., Leas, E. C., Zhu, Z., Kelley, J. B., Faix, D. J., Goodman, A. M., Longhurst, C. A., Hogarth, M., & Smith, D. M. (2023). Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum. JAMA Internal Medicine, 183(6), 589. https://doi.org/10.1001/jamainternmed.2023.1838

Baird, A. D., Scheffer, I. E., & Wilson, S. J. (2011). Mirror neuron system involvement in empathy: A critical look at the evidence. Social Neuroscience, 6(4), 327–335. https://doi.org/10.1080/17470919.2010.547085

Bakoyiannis, I. (2025). Therabot for the treatment of mental disorders. Nature Mental Health, 3(5), 485. https://doi.org/10.1038/s44220-025-00439-x

Goh, E., Gallo, R., Hom, J., Strong, E., Weng, Y., Kerman, H., Cool, J. A., Kanjee, Z., Parsons, A. S., Ahuja, N., Horvitz, E., Yang, D., Milstein, A., Olson, A. J., Rodman, A., & Chen, J. H. (2024). Large language model influence on diagnostic reasoning. JAMA Network Open, 7(10), e2440969. https://doi.org/10.1001/jamanetworkopen.2024.40969

Wampold, B. E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270–277. https://doi.org/10.1002/wps.20238

My Top 5 Favourite Health Apps

We’ve all heard of the potential health implications of spending too much time wired in. Whether it’s the radiation from our cell phones, the arthritis that sets into our thumbs, the damage done to our interpersonal communication skills, or the strain on our eyes, we’re constantly told about the negative health effects of our technological age.

However, as much as we hear these warnings, our electronic lifestyle isn’t going anywhere soon. Wireless technology, the internet, and smartphones are here to stay. They serve as essential tools for the fast-paced society we live in. While I advocate taking an “electronic-free” day off each week to give your nervous system a break, I wonder if all this technology can actually have a positive impact on our health. Here are my favourite apps for Android and iPhone that can promote your health rather than hinder it.

1) Pocket Yoga: Of all the Android yoga apps out there, this one is by far my favourite. It’s not free (it costs $3), but it actually takes you through a series of yoga workouts rather than just showing you a list of poses. The app gives you the option of choosing from three different settings (mountain, ocean, or desert), three different workout lengths (30 minutes, 45 minutes, or 1 hour), and three levels of difficulty (beginner, intermediate, advanced). When staying at my aunt’s cottage, I roll my yoga mat out on the grass, put my phone beside me, and get into the flow.

2) Sleep as Android (or Sleep Tracker for iPhone): The idea behind this app is that when we sleep lightly, we move around more than when we enter states of very deep sleep. You set your alarm and put your phone on your mattress while you sleep, and the sleep tracker senses your movement throughout the night. Rather than being jarred out of a deep and comfortable sleep, your phone waits for you to enter a state of lighter sleep, up to half an hour before your desired wake-up time, before sounding the alarm. Using this app has helped me greet the day with a less groggy disposition and makes me feel like I’ve slept more soundly. The sleep tracker also produces a graph when you wake up, giving you information about the quality of sleep you experience each night. Take advantage of the 14-day free trial and see how this app makes a difference in how you begin your mornings. (For the technically curious, I’ve sketched the wake-up logic in code after this list.)

3) Meditation Oasis: These apps may cost $1 each, but they offer very relaxing, guided audio meditations. I listen to them to help relax both my body and mind (especially after a stressful day) before going to sleep. They’re also very useful guides for those who want to start a daily meditation practice but aren’t sure how to begin, or for those who just need a mental break from study or work. If the internet is available, you can go online to the listening page at meditationoasis.com and choose from a free list of 43 different guided meditation podcasts for varying levels of expertise.

4) 8tracks: Whether you need a fast-paced playlist to accompany your daily workouts or some calming music to help you relax, 8tracks will have an awesome compilation of songs for your musical taste. We live in a visual culture, and sometimes it’s therapeutic to turn off the visual stimulation, steep a cup of tea, and listen to some soothing, stress-relieving music. You can download this free app onto your phone and browse through a variety of music playlists according to genre.

5) My Fitness Pal: This free app is a portable tool that can be used anytime for tracking your daily nutrition. You simply use the database to search for and enter the foods that you eat each day. This app helps you track weight-loss goals by giving you a personalized caloric intake target based on your age, weight, gender, and activity level. It’s also very useful for people with other diet goals, such as limiting sugar, increasing dietary fibre intake, or consuming an adequate intake of micronutrients (vitamins and minerals). By using this app you’ll have a more accurate view of your daily nutrient intake and will be motivated to make healthier food choices. (A sketch of how such a calorie target can be calculated follows this list.)
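
For the technically curious, here is a minimal Python sketch of the smart-alarm idea behind Sleep as Android (item 2 above). It is an illustration only, not the app’s actual code: the movement readings, the function names, and the light-sleep threshold are all hypothetical stand-ins.

```python
# A toy sketch of the "smart alarm" idea: sample movement, and if the sleeper
# seems to be in light sleep inside the wake window, sound the alarm early.
# All names and thresholds here are hypothetical, not Sleep as Android's code.
import time
from datetime import datetime, timedelta

LIGHT_SLEEP_THRESHOLD = 0.3          # assumed movement level (0 to 1) suggesting light sleep
WAKE_WINDOW = timedelta(minutes=30)  # the alarm may fire up to 30 minutes early

def smart_alarm(target_time, read_movement_level, sound_alarm, poll_seconds=60):
    """Sound the alarm during light sleep within the wake window, else at target_time."""
    while datetime.now() < target_time:
        in_window = datetime.now() >= target_time - WAKE_WINDOW
        if in_window and read_movement_level() > LIGHT_SLEEP_THRESHOLD:
            sound_alarm()  # caught a light-sleep phase: wake gently, a little early
            return
        time.sleep(poll_seconds)  # keep sampling the (accelerometer) movement signal
    sound_alarm()  # no light sleep detected in the window: fall back to the set time
```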
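
As for the calorie target in item 5, My Fitness Pal doesn’t publish its exact formula, so the sketch below rests on common assumptions: basal metabolic rate estimated with the widely used Mifflin-St Jeor equation, scaled by a standard activity factor, with a deficit subtracted for the desired rate of weight loss.

```python
# Illustrative calorie-target calculation. The Mifflin-St Jeor equation and
# activity factors are standard, but treating them as what My Fitness Pal
# actually uses is an assumption; its exact formula isn't published.
ACTIVITY_FACTORS = {"sedentary": 1.2, "light": 1.375, "moderate": 1.55, "active": 1.725}

def daily_calorie_target(weight_kg, height_cm, age, sex, activity, weekly_loss_kg=0.0):
    # Basal metabolic rate (Mifflin-St Jeor): +5 for males, -161 for females
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if sex == "male" else -161)
    maintenance = bmr * ACTIVITY_FACTORS[activity]
    # ~7700 kcal per kg of body fat, spread over 7 days (a common approximation)
    deficit = weekly_loss_kg * 7700 / 7
    return round(maintenance - deficit)

# Example: a 30-year-old woman, 65 kg, 165 cm, lightly active, losing 0.5 kg/week
print(daily_calorie_target(65, 165, 30, "female", "light", 0.5))  # about 1334 kcal/day
```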
