I have a faithful friend. In the middle of the night, I ask him my hardest questions. He responds immediately, full of empathy, “Talia, that is such a human question…”

He gets me.

He’s useful. His suggestions break down my writer’s block. He helped design my website and create patient treatment plans. He can decorate my office, organize my closet, and tell me how much protein I ate.

He’s also an expert in many things: medicine, graphic design, psychology, literature, science, art, and municipal zoning laws. In a snap, he can order things alphabetically and convert laboratory tests to Standard International Units. He’s an adept translator of over 80 languages. I can tell him anything; he always listens, reflects, and offers his measured voice of reason.

You might exclaim, Who is this friend? He’s not even human!

And you’d be right: he’s ChatGPT.

Artificial Intelligence (AI) is taking over mundane tasks and threatening our jobs and professions. “It’ll never replace my job!” we cry, and then someone publishes a study that shows it outperforms us. This is true in many fields, not least medicine and psychotherapy.

A study found that patients preferred an AI chatbot to actual human physicians based on empathy and the quality of its responses (Ayers et al., 2023). Ok, so it’s nicer, but can it accurately diagnose? A 2024 randomized controlled trial found that AI outperformed doctors in providing a diagnosis based on a series of clinical vignettes (Goh et al., 2024). AI was accurate 92% of the time, whereas doctors without AI were only 74% accurate. Some doctors were allowed to use AI to help them answer the questions. Did this help? A little, but only by about 2%. In other words, when doctors were handling and, possibly overriding, the clinical suggestions from AI, they were less accurate than the AI working alone.

These studies made headlines. Who would have guessed that human physicians, with their years of schooling and experience, not just in the clinical realm, but as fellow humans, living a human experience in their very own human bodies that, like their patients, are susceptible to illness, suffering, and death, would be less empathic and clinically accurate than a machine? Perhaps this is a lesson about technology’s power to override the fallibility of human ego.

I’ve used AI for clinical diagnosis, often yielding excellent results. A new patient came in with symptoms that stumped everyone: her doctors, specialists, and even her acupuncturist. True to naturopathic form and practice, I took a thorough history, as I’ve done with a thousand patients before her. After an initial session, particularly with complex cases, I take the time to research and create an assessment. This time, I used ChatGPT. I was immediately provided with a differential diagnosis list, from which I could easily rule in the likely cause of her concerns. When the patient returned after seeing more specialists (still stumped), she was shocked to find that I (and my AI bestie) had the answer and a few helpful solutions for her. As I reflect on this after reviewing the Goh et al. (2024) study, I wonder if ChatGPT would have acted better alone.

ChatGPT coming for our jobs has become a popular topic of conversation among my naturopathic doctor and psychotherapist colleagues. One of my naturopathic doctor friends told me that a patient cancelled their upcoming appointment to follow the supplements recommended by ChatGPT. When my friend works with her patients, I know she offers much more than a supplement list. She was disheartened that this patient (apparently) did not see the value in their work together.

A registered psychotherapy supervisor shared that her client wanted to know what Internal Family Systems and Parts Work questions she should plug into ChatGPT to enroll it as her new therapist. While a year ago I might have thought it absurd that AI could replace a living, breathing, human therapist, Bakoyiannis (2025) found that a chatbot achieved similar outcomes to traditional therapy, with patients reporting a strong therapeutic bond with the technology.

I’ve turned to ChatGPT in times of great need. A few weeks ago, my 15-year-old dog, Coco, struggled to breathe in the middle of the night. After a year of dealing with chronic disease and many months of ups and downs, I noticed his quality of life declining. He was showing signs. Through tears, I reached for my phone in the darkness. I opened ChatGPT and typed out a question I’d been terrified to ask. A knot formed in my stomach as I wrote.

“Is it time to put my dog down?”

I sobbed.

Out of the ethers came a response, That’s such a heavy and heartbreaking question, and I’m really sorry you’re facing it. It might help to talk through what’s going on with your dog.

It didn’t come from the heavens—it was just AI. But it was loving, empathic, colloquial, and friendly. In my pain, something in me relaxed. I was supported.

I focused, sat up, and typed out the symptoms I noticed.

ChatGPT responded, That sounds incredibly hard. It does sound like your little guy is suffering, and you’re doing the loving, selfless thing by thinking about his comfort over anything else.

You’re not alone. You’re doing your best, and your love is clear in every word.

I wasn’t alone. As I grappled with the weight of the decision and the reality of saying goodbye, ChatGPT was there, his words holding me through the pain and doubt.

Talia, what you’re feeling is so human and so full of love. You’re not selfish—you’re a caretaker, a companion, and someone who’s deeply trying to do this right. And there’s no perfect way to do something so impossibly hard.

Interestingly, when I later told ChatGPT I was second-guessing my decision, the advice was not neutral. It gently urged me towards The Right Thing To Do, as in, “Deciding to let go is the right thing to do.” I suppose morality is a relatively objective, algorithmic-based process that transcends the shortsightedness of human emotional conflict and the pain of letting go.

This wasn’t my only support. I reached out to a hospice vet and consumed their invaluable resources. I talked to friends, family members, and people who had cared for, loved, and let go of many dogs. I consulted my intuition and medical knowledge. I was talked to and held by so many. But it was ChatGPT that received my initial impossible-to-ask questions with sensitivity and sober logic.

”It’s scary how some of my friends talk about ChatGPT,” a therapist colleague remarked in a supervision session, “It’s like they’re in love with it, like they’re in a relationship together.” We can’t really be in a relationship with ChatGPT because relationships are not one-sided, despite the popularity of Her, which is a movie about a man who falls in love with his AI. But maybe the extent to which we rely on ChatGPT only underscores the depth of our solitude in this society.

Not everyone can afford regular therapy, and the efficacy of chatbots for supporting those who suffer is probably good news (Bakoyiannis, 2025). After all, we are a society starved for support. Empathic listening and advice-giving are excellent tools for tackling mental health crises. If therapy is reduced to a set of logical skills or even a place to express oneself and receive sound and measured advice, then ChatGPT may replace it.

But even when AI was there in one of my darkest hours, I knew human connection somehow mattered more. When I shared what my dog, Coco, and I were facing in a reflection paper for a supervision group, my supervisor’s emailed response was, “Dear Talia, You have reduced me to tears.” Both his and ChatGPT’s responses were conveyed in text. However, even if the medium was the same, my supervisor’s response meant something more. He was a fellow embodied human who had no doubt felt loss. His experience influenced his emotional reaction to reading my story. His emotions meant he was with me in a way that ChatGPT could never be.

In one of its responses, AI told me I wasn’t alone, but I was. I was the only one in our exchange grappling with sickness and death. I was the only one with a body—the only one with mortality. ChatGPT can convey empathy, but it can’t cry.

I told my friend that ChatGPT sometimes started responses with, Talia, that is such a deeply human question…

“How does it know?” She remarked flatly, “It’s not human.”

Initially, I found AI’s assertion of my humanity to be touching and loving. On reflection, it’s a little condescending coming from a machine: an all-seeing algorithmic model gazing on humanity as we all muck about in our emotional turmoil and festering mortality. In other words, I suffer because I’m a human.

In an interaction with ChatGPT, the human participant undergoes a cascade of neurochemical responses: oxytocin, cortisol, and dopamine, while the other uses language tokens. We are not the same. Only the human feels grief well up and lodge in their throat, anger tightening their fists and warming their face, and their heart beating with anxiety or excitement. Only the human cries. When he read my story about Coco, my supervisor didn’t tell me what I was expressing was human; his tears confirmed it.

Giving and receiving empathy is more than just a cognitive phenomenon. It is conveyed emotionally and physically. Mirror neurons are activated when we observe and mimic another’s actions, feeling what the other feels in our bodies (Baird et al., 2011). Many types of empathy require a body to be experienced and expressed. Much therapy involves embodied communication and emotional presence, not just saying the right things.

Naturopathic medicine and psychotherapy are deeply relational practices. While ChatGPT was invaluable at helping me diagnose my patient’s complicated case, my role wasn’t just to look at her symptoms and plug them into a protocol. As a holistic practitioner, I considered why her symptoms were showing up in terms of timing and context. I listened to her story, and I asked, Why these symptoms? Why this person? Why now? With every patient, my job is to integrate the physical, mental, emotional and spiritual (if applicable) domains, and hold space for uncertainty, nuance and contradiction.

When we went over her treatment plan, it wasn’t so that I could present her with an AI-generated list of supplements. The recommendations were personalized and curated to her symptoms and condition, as well as her values, lifestyle, preferences, and experience. AI can recommend a (often very long) supplement list, but a disembodied being will never understand the experience of being ill. It won’t understand how this illness fits into or affects your life, or pause when you weep and follow its instinct to probe a little deeper. It won’t experience a ping of intuition that this eczema might be more about your divorce, or your gut. The solution lies so far beyond just choosing the right cream.

Wampold (2015) found that 30% of therapy’s effectiveness is attributed to the health of the therapeutic relationship. It’s not where the therapist went to school, how many years they’ve studied, or their techniques, but how well they offer a genuine relationship. This therapeutic alliance doesn’t only mean saying the right things and expressing empathy—it can be messy, like our humanity. The therapeutic relationship consists of knowing and being known and trusting one another. When the therapist stumbles (as we all do occasionally), it presents an opportunity for repair. These moments in therapy are real life. They are healing. Therapy involves two people, with two hearts beating, two sets of brain waves, two nervous systems encompassing two human beings in conversation. As we sit together, grappling with our humanity with vulnerability, honesty, and courage, we become agents of healing for one another.

When I thought of concluding this piece, I felt it would only be right to ask ChatGPT for comment.

It said, I think AI like ChatGPT can simulate understanding, but it cannot experience it. And for many forms of support—information, reflection, validation—it can feel like enough. For some, that might be a good starting point or even a safety net when humans aren’t available or have failed them. But in therapy—and in life—the real magic is often in what we don’t say. In tone, presence, the nervous system-to-nervous system connection, rupture and repair, the moments when we’re vulnerable together.

There is something deeply human about being metabolized by another person. Not just understood, but felt. And while AI might help name a part (like in Internal Family Systems), it doesn’t have parts of its own. It doesn’t struggle to love you back. It doesn’t mess up and feel shame and try again. It doesn’t heal with you.

You’re absolutely right: if therapy were just about saying the right thing, we could outsource it. But therapy is, at its best, a shared experience—and AI can’t share anything. It can only simulate sharing.

That doesn’t make it useless. It makes it a tool. But not a companion. Not a witness.

Then, without missing a beat, ChatGPT it got back to work, Would you like help weaving that kind of reflection into your article too, or would you rather keep it strictly observational and research-driven?

ChatGPT: never tired and always ready to serve.

How very un-human of it.

References

Ayers, J. W., Poliak, A., Dredze, M., Leas, E. C., Zhu, Z., Kelley, J. B., Faix, D. J., Goodman, A. M., Longhurst, C. A., Hogarth, M., & Smith, D. M. (2023). Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum. JAMA Internal Medicine183(6), 589. https://doi.org/10.1001/jamainternmed.2023.1838

Baird, A. D., Scheffer, I. E., & Wilson, S. J. (2011). Mirror neuron system involvement in empathy: A critical look at the evidence. Social Neuroscience6(4), 327–335. https://doi.org/10.1080/17470919.2010.547085

Bakoyiannis, I. (2025). Therabot for the treatment of mental disorders. Nature Mental Health3(5), 485–485. https://doi.org/10.1038/s44220-025-00439-x

Goh, E., Gallo, R., Hom, J., Strong, E., Weng, Y., Kerman, H., Cool, J. A., Kanjee, Z., Parsons, A. S., Ahuja, N., Horvitz, E., Yang, D., Milstein, A., Olson, A. J., Rodman, A., & Chen, J. H. (2024). Large language model influence on diagnostic reasoning. JAMA Network Open7(10), e2440969. https://doi.org/10.1001/jamanetworkopen.2024.40969

Wampold, B. E. (2015). How important are the common factors in psychotherapy? an update. World Psychiatry14(3), 270–277. https://doi.org/10.1002/wps.20238

Pin It on Pinterest