Would you trust advice from AI when it comes to your most vulnerable moments?
What If Your Therapist Wasn’t Alone in the Room?
Imagine you’re mid-session with your therapist, opening up and feeling heard, only to learn that your words were quietly passed through ChatGPT. What felt like compassionate, personal insight was actually shaped by an algorithm.
For some clients, this isn’t just a thought experiment. In one case reported by MIT Technology Review, a 25-year-old named Hope messaged her therapist about her dog’s death. The response was tender until she noticed a prompt accidentally left in: “Here’s a more human, heartfelt version…”
Although her therapist later apologized and explained she had used AI only because she had never owned a pet herself, the damage was done. Hope says the discovery altered her perception of her therapist’s empathy and professionalism, and it shattered her trust.
As AI tools slip into therapy sessions, they raise profound questions. Are therapists enhancing care or outsourcing it? And at what cost to trust?
Why Some Therapists May Turn to ChatGPT
Therapists today face growing caseloads and administrative tasks that can take time away from direct care. AI tools like ChatGPT offer a tempting solution: they can help with writing notes, summarizing sessions, and even simplifying explanations for clients with different communication needs. In these cases, AI works more like a quiet assistant than a replacement. But it is that very invisibility that raises concerns: some therapists are using AI without telling their clients.
When the Curtain Lifts
Therapy rests on trust, honesty, and confidentiality. When a client finds out that AI was used without their knowledge, it can feel like a major breach even if the counsel itself is sound. Simply knowing the words came from a machine can shift how a client understands the therapist’s role and the safety of the space.
A therapist’s responses can help someone process trauma, manage anxiety, and make meaning in painful moments. But when those words come from ChatGPT, even if they sound caring, the relationship changes. AI cannot grasp the full context: a client’s history, culture, nonverbal cues, and the emotions that never quite get spoken aloud. That missing depth matters.
Some clients may begin to wonder whether the warmth they relied on was ever personal or whether it was polished, generic output. For people already navigating trust issues or past harm, this discovery can feel like a deep betrayal.
The Ethical Crossroads
If therapists are quietly outsourcing part of their role, what does that mean for professional ethics? Should clients have a right to know when AI shapes their care? Does it matter if no identifying details are shared?
Consider the stakes: a vulnerable client shares their deepest trauma, only to learn it was processed by an AI system they never consented to. Or a therapist begins leaning too heavily on ChatGPT, allowing it to steer sessions rather than simply inspire them.
Some argue this secrecy mirrors boundary violations, like secretly recording a session. Others suggest it’s no different from consulting a textbook, provided confidentiality is preserved. The truth may lie in how transparently the tool is used.
Take, for instance, the experience of Viola’s mother, who lives alone in a small city in eastern China. She used to travel two days just to see a doctor until she began using DeepSeek, China’s leading AI chatbot. From her couch, she shared symptoms, lab results, and personal health details, sometimes chatting for hours. The bot replied with warmth, encouragement, and tailored advice, becoming what she called her “best health adviser.”
For people who are sick, anxious, or isolated, AI companions can offer comfort, guidance, and constant availability in ways even family cannot. Still, as AI grows more present in our personal lives, it raises difficult questions about how our private data is handled.
OpenAI’s ongoing legal battle with The New York Times underscores this concern: if companies are forced to retain every user interaction, including conversations users have explicitly chosen to delete, it could weaken the sense of safety these tools depend on and chip away at the trust that makes them feel so personal in the first place.
(Update: it now looks like, at least in this case, OpenAI will not have to retain explicitly deleted chats.)
Toward a More Honest Future
What if therapists openly explained when and how they use AI? Informed consent could make clients feel safer and more in control. Some might even welcome the extra support. But without honesty, the presence of AI can quietly damage the trust that therapy depends on.
One thing is clear. Therapy thrives on trust. Without it, progress falters.
If ChatGPT is going to remain part of the therapeutic toolkit, it must enter through the front door, with honesty, care, and respect for client autonomy, rather than hiding behind the therapist’s screen. A therapy session is not a ticket refund request: what happens there is neither easily resolved nor easily forgotten.
__
Did you enjoy this post? If you found it helpful, feel free to share it with your friends! We’d love to hear your thoughts—reach out to us at contact@personalitymax.com. We’re always happy to answer any questions.
Please note that PersonalityMax.com is not a source of financial, legal, or medical guidance.
