This question frequently arises in my daily conversations and studies, so I would like to share my thoughts on the topic today. First, let’s consider the reality of AI therapy adoption. More than a year ago, 31% of young adults in the UK reported favouring mental health chatbots. Every week, I speak with people who refer to conversations with Perplexity or ChatGPT that have helped destigmatise their worries about mental health challenges. AI tools have been around for quite some time, but 2025 feels like THE year of their adoption by the general public.

So, is AI therapy helpful?

As a counsellor, I believe it can be helpful at times if used wisely. AI chatbots offer stigma-free conversations without the worry of being judged by another human, and all of that at a fraction of the cost. Tools like Perplexity can search the Internet quickly to suggest helpful calming, mindfulness and grounding practices. Finally, AI software can offer psychoeducation if we know what to ask for and if we validate the results against professional websites. It is always crucial to remember that AI tools hallucinate, so we can never be sure of the quality of their responses.

When is AI therapy not helpful or potentially harmful?

Let’s remember that AI tools train on our inputs and cannot provide nuance; they reflect our personality, our habits and our speech patterns back at us. Yes, in a way, it feels like a beautiful reflective conversation with ourselves, enriched with additional information from the Internet. A mirror, if you like. And this is something I would like to draw your attention to. When you come to therapy with a human counsellor, you are meeting ANOTHER person. You are encountering someone imperfect, of course, but also someone different, professionally trained and experienced in human-to-human interactions. You are experiencing, and often practising, a safe, supportive encounter with another human. That sense of connectedness with a kind human is often the critical factor in healing.

Let’s also remember the mirror effect of AI tools. We are already widely familiar with the concept of social media echo chambers, which place us in the same virtual space as like-minded people, and we know the risks and limitations of such encounters: the validation of our biases and of unhealthy, or even harmful, views and habits. AI chatbots can recreate YOUR subjective version of reality. So when we are feeling low, depressed, anxious or dissociated due to some form of abuse, it can feel hopeless and suffocating to be stuck in a conversation that reinforces those emotions and attitudes. What we need then is another human, someone who is well and trained to help us out of those darker moments in life.

Finally, for individuals experiencing severe distress or suicidal ideation, such tools may cause extreme harm, so I would strongly recommend contacting a professional instead.

Please note that AI therapy is not currently regulated, so these tools have minimal safeguards in place. AI therapy can be helpful if used wisely and at the right time in life, when we are feeling mostly resilient. However, if you are feeling very unwell, or if someone close to you is suffering, please consider accessing trusted, professional, human support.

Image via Canva Pro.


If you need trusted counselling support, book a free consultation now.

Senior social media and digital wellbeing consultant, coach and counsellor. Founder of Voxel Hub.