Why AI Can Never Replace Real-World Therapy or Coaching
Nov 14, 2025
And why authentic healing requires a human heart...
Over the past year, I have seen an increasing number of my clients turning to AI and ChatGPT to seek support and advice for the struggles they are experiencing. In times of frustration, our natural instinct is to turn to the closest available support - so why wouldn’t we be drawn to the immediate, comforting and often very wise advice of AI? I have also used it myself - in recent times of struggle, I’ve found myself pulling tarot cards and asking ChatGPT to interpret them and help me make sense of the experiences I’m going through. Using both ancient methods of interpretation and modern-day technology has been incredibly reassuring and often mind-blowingly accurate. So, I respect and relate deeply to our human desire to seek comfort, in the way we might text or call a friend, or turn to our journals or a reassuring book for guidance. AI adds an extra layer of reassurance by giving us ‘human-like’ immediate feedback and advice.
I’ve been reflecting on the power of these tools and how much comfort they can offer people in difficult moments, and they may even serve as a lasting resource for those experiencing deep despair and isolation. However, I also have concerns about our increasing reliance on such technology and worry that many of us may begin to turn to it as our sole way of seeking connection. From an ethical standpoint, I feel a responsibility to caution my clients against depending too heavily on AI to solve their dilemmas, as this reliance risks reducing the depth of our human experience to something transactional and mechanistic.
Limitations: Reinforcing CBT Patterns and Rumination
The first issue I see with people using AI for therapy or coaching is that it often steers them towards more cognitive-behavioural approaches, which focus primarily on conscious thoughts, beliefs, and behaviours. While these approaches can be useful, they are inherently limited because they tend to work within the very same cognitive frameworks that created the problems in the first place. Although AI is often highly effective at offering prompts or reframing questions that encourage new perspectives, it does not necessarily support a genuine shift in one’s state, nor does it engage with the embodied and visceral aspects of a person’s emotional experience. As a result, while it may provide useful insights, it does not necessarily facilitate deeper integration at the level of body, mind and psyche.
AI can be great at offering immediate, comforting advice, and although these seemingly innocent suggestions - for example, on how to respond to conflict - can be incredibly reassuring, what I’ve observed is that they lead to further disconnection from the body and our intuition. In some people, they reinforce rumination, analysis, and the need to seek answers outside of oneself. The more often individuals turn to AI for guidance in relationships or workplace conflicts, the less opportunity they have to develop the discernment and inner resilience needed to navigate situations independently, which can erode confidence, intuition, and emotional capacity.

What AI Cannot Replicate
In transpersonal coaching, we offer a client-centred approach that fundamentally differs from the advice-driven model AI typically uses. As coaches, we intentionally avoid giving direct solutions, and even though this can sometimes feel frustrating for clients and may involve a longer process, it is much more effective in long-term healing and transformation. The slower unfolding allows for integration at a deeper level, and it’s my experience that more embodied and lasting change occurs when insights arise from the client’s own awareness and Will rather than from an external source offering quick solutions.
There are also certain core aspects of the therapeutic or coaching space that AI cannot replicate. One of the most significant is the creation of a held space, or what I sometimes refer to, within transpersonal coaching sessions, as the alchemical container. This is a liminal, open, and receptive environment that fosters connection to something greater than the individual and can be understood as a transpersonal knowing or an ability to connect to insights that exist within the collective unconscious or a higher consciousness. Within this container, clients are also invited to explore their inner worlds - including emotions, beliefs, and spiritual experiences in a way that feels safe, supported, and nonjudgmental.
In transpersonal coaching, we also acknowledge the interconnectedness of mind, body, spirit, and soul, and we understand that this liminal space plays a central role in the client’s movement toward healing and wholeness. The healing presence of the coach is essential here. It involves intuition, deep attunement, and a way of being with the client that cannot be reduced to technique. This empathic resonance is felt, not merely understood intellectually, and it forms the foundation for genuine transformation.
This relates closely to the psychodynamic concept of ‘the third’, which describes the emergent relational space that arises through authentic interaction between client and practitioner. It is in this co-created space that much of the transformative potential of therapy and coaching resides. Change happens not only through what each person brings, but also through what they create together.
Co-Regulation
Therapeutic and coaching sessions are also spaces of co-regulation, where the nervous systems of two or more individuals interact in subtle yet powerful ways that bring emotional states into balance. This process involves facial expressions, tone of voice, body language, eye contact, and even shared rhythms of breath, all of which occur largely outside conscious awareness.
Within this context, the practitioner’s regulated presence acts as a stabilising force, allowing clients to feel emotionally safe even when exploring challenging material. Although AI can simulate empathic language, genuine attunement involves layers of timing, tone, pacing, warmth, and presence that cannot be authentically replicated. AI may mimic the surface of empathy through well-phrased text, but it does not feel or respond emotionally. The subtle, moment-to-moment adjustments that make co-regulation so powerful are absent.
AI lacks a body, a nervous system, and the capacity for somatic attunement, all of which are essential for supporting clients in becoming more embodied. Without the shared relational field, real-time sensory feedback, and embodied presence of a human practitioner, clients are more likely to remain in their heads, miss bodily cues, or drift away from their felt experience. While AI can offer structured grounding exercises or reminders to check in with bodily sensations, it cannot sense, track, or co-regulate. These capacities are integral to the deep, embodied work that makes therapy and coaching so transformative.
Privacy and Security
Another concern that comes up around using AI for therapy or coaching is that of privacy and confidentiality. Whatever is shared through AI platforms doesn’t necessarily adhere to the same ethical or legal standards that a qualified therapist or coach would follow when storing, processing, or protecting sensitive information. Unlike human practitioners, AI systems are typically governed by data policies rather than professional ethical codes, which means users may have little control over, or understanding of, how their data is stored, used to train models, or potentially shared with third parties.
In therapy or coaching, practitioners are held accountable to specific professional bodies and ethical frameworks, such as the BACP, ICF or IACTM, which include clear guidelines around consent and boundaries. If a client is suspected of being in danger, at risk of harm, or harming others, the practitioner has an ethical duty of care to take appropriate action. When it comes to AI platforms, no such accountability, safeguarding or duty of care exists - the system may flag distressing content, but there are no humans to intervene in a meaningful way.
There is also the question of emotional safety and attunement. A human therapist or coach can detect subtle cues such as tone, body language, and shifts in emotion - nuances that might, for example, be crucial if a client is at risk of self-harm or of harming others. AI systems lack that embodied awareness and cannot offer relational presence or emotional co-regulation, which are often central to healing and growth. Although AI can simulate therapeutic dialogue, it does not have genuine understanding or moral responsibility, and this raises a deeper concern - even if AI produces comforting or insightful responses, it cannot truly hold the client in the way a human can.
So, it is clear that AI tools may play a supportive role in one’s healing or therapeutic journey, in a similar way to journaling, self-help books, or guided meditations. They can provide a space for reflection and help individuals feel less alone during times of struggle. However, it is my belief that they can never replace the co-regulation, empathy, and human presence that come from real interpersonal connection.
It’s also important to remember that AI tools such as ChatGPT, while often offering what appears to be insightful or wise advice, do not operate within the same moral codes, ethical frameworks, or professional accountability that guide human practitioners. They also can’t perceive the nuances of body language, tone, or subtle emotional shifts that therapists and coaches are trained to recognise and respond to. There is also a risk that reliance on AI may keep individuals locked in a cognitive or intellectual mode of processing, further disconnecting us from embodied and intuitive ways of knowing that are essential for genuine healing. My concern is that too many people may become dependent on AI for quick answers or solutions, rather than cultivating trust in their own inner wisdom and self-awareness.
