
‘If AI is the only place people feel heard, that’s a societal problem’ — why one charity is pushing back on mental health chatbots | TechRadar

A growing number of people are turning to AI for mental health support due to cost and accessibility — but charities warn it’s risky and are offering safer,...


As we’ve covered before on TechRadar, a growing number of people are turning to AI chatbots like ChatGPT and Gemini for something they were never really designed to do: provide mental health support.

According to new research from suicide prevention charity Campaign Against Living Miserably (CALM), around 1 in 4 people in the UK are now using AI tools for mental health advice. Among Gen Z, that figure jumps to 42%, suggesting a generation increasingly comfortable opening up to chatbots rather than people.

On one level, the appeal is obvious: AI is free, instant, and always available. There are no waiting lists, no awkward conversations, and, importantly, no cost barrier, at least for the basic versions. But that convenience is masking a more complicated reality.


The UK’s Medicines and Healthcare products Regulatory Agency (MHRA) has already urged caution around using tech for mental health support, and CALM’s findings suggest many people are relying on tools that aren’t designed, regulated, or clinically validated for that role.

At the same time, there’s a deeper issue driving this shift: affordability.

CALM’s research found that Brits are set to spend £2.3 billion(1) on mental health apps in 2026, with some people making difficult trade-offs to access support. Around 23% say they would prioritise paying for these apps over heating their homes, while 28% would choose them over basic necessities like food. I found that shocking.

That’s the gap CALM is trying to address with its new CALMzone app, a free, expert-backed mental health toolkit designed to offer support without the risks or costs associated with many existing options. You can download it for iOS and Android.

Unlike AI chatbots, CALMzone doesn’t attempt to simulate conversation or generate responses on the fly. Instead, it focuses on evidence-based techniques created by mental health professionals, offering structured support for issues like stress, anxiety, and low mood.

The idea is to provide something that sits between doing nothing and needing crisis intervention — a space for what CALM describes as “daily maintenance” for mental health.

“We’re in the midst of a really concerning wave of increasing stigma when it comes to conversations around mental health,” said CALM CEO Simon Gunning. “It’s clear that people need somewhere to turn when things start to bubble over and more immediate, targeted relief is needed.”


For some users, chatbots can feel like a safe, judgment-free space to open up. But they also come with well-documented limitations, including hallucinations, lack of clinical oversight, and the potential for misleading or inappropriate responses.

CALM’s approach is deliberately more cautious. To find out more about that approach, and why the charity is steering clear of AI for now, I caught up with CALM’s Head of Services, Wendy Robinson.

GB: What does CALMzone deliberately not do that some AI-driven mental health tools currently do?

WR: The CALMzone app very intentionally doesn’t pretend to be a human being. The language is very personable and accessible, but when it comes to mental health, having a piece of tech imitating a conversation with a human being isn’t something we can ethically endorse.

There have been lots of stories in the media over the past couple of years that are particularly centred around AI chatbots, and we had the potential to include this functionality in CALMzone. For some people AI chatbots can appear to be a useful resource, but the tech behind AI has evolved far too rapidly for any kind of regulation or clinical practice to safeguard it. In the field of suicide prevention this is a risk we simply can’t afford to take.


We’ve taken the tried and tested ‘old school’ approach where everything within our app has been created by a human being — not just a human being, but an expert in their field — and we don’t currently enable machine learning to tailor the user’s experience. Instead we monitor anonymous user behaviour and their in-app feedback on activities to tailor their experience and recommendations.

GB: Why was it important to keep CALMzone completely free, rather than offering a premium tier to fund it?

WR: Our guiding principle is that happiness is a right, not a privilege.

As a charity, we wanted to ensure there were no barriers to accessing this wonderful product. Its purpose is to help as many people as possible, not to make money.

The world is so challenging at the moment as people continue to struggle with the rising cost of living, and there’s a direct link between money worries and mental health - yet we’ve identified there’s a staggering cost behind mental wellness.


We were in a unique position to address this thanks to the very generous gift from Mind Ease. After eight years of investment and development, they donated the app in its entirety to our charity, which means users are getting a genuinely premium product at no cost.

We’ll be continuing to fundraise to ensure that we don’t ever have to charge users to access this life-saving digital support.

GB: Gen Z are clearly comfortable turning to AI for emotional support — what should governments, charities, and tech companies be learning from that behavior right now?

WR: Turning to AI chatbots for mental health support is indicative of a societal problem, not just an issue with our health infrastructure. Of course extensive waiting lists for mental health support are likely to be a significant reason that people are turning to other more immediate means — investment in these services as part of the NHS 10-year plan is welcome, although it’s a fraction of what the sector actually needs to turn this around — but it also points to a return to the stigma that we’ve worked so hard to remove.

As party politics rage, mental health is often made a scapegoat for a lot of the economic problems the UK is facing. “Just the ups and downs of life” is becoming a slogan for people undermining the experiences of people struggling with anxiety and depression, and our concern is that this stigma is leading to young people especially becoming increasingly insular and isolated when it comes to their mental health.

So what can we learn from this? That people need immediate support, they need opportunities for human connection in seeking this, and we need to return to empathy and support rather than demonising people’s experiences.

GB: If current trends continue, what worries you most about how young people will seek mental health support in five years’ time?

WR: If current trends continue, our biggest concern is that young people lose the opportunity to experience being helped by another human being, and fear being ridiculed for speaking about this openly.

It’s clear that AI continues to have issues when it comes to hallucinations, so unless people are hypervigilant in vetting the information they’re presented with, there’s no guarantee of its accuracy or effectiveness. It may even lead to greater harm.


And the introduction of paid ads within ChatGPT opens us up to a whole new world of ethical considerations. How will they safeguard people confiding in LLMs about their mental health and prevent them from being marketed to as a result?

Our app won’t solve many of these issues, but we hope that encouraging everyone to regularly maintain good mental health by using it will prevent more people reaching a point of crisis, ensure they are accessing clinically-proven and reliable techniques, and go some way in addressing the growing wave of stigma.

(1) Data from Censuswide shows 41.5% of UK respondents estimate they will spend an average of £6.72 per month on mental health apps in 2026; multiplied by 12 months, that equals £80.64 for the year. The extrapolation is based on a UK population estimate of 69,487,000, per the Office for National Statistics, released November 2025.
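The footnote's extrapolation can be sanity-checked with a few lines of arithmetic. All input figures below are taken from the footnote itself; the variable names are illustrative:

```python
# Sanity check of the Censuswide/CALM spend extrapolation in footnote (1).
monthly_spend = 6.72               # average estimated spend per user, £/month
annual_spend = monthly_spend * 12  # £80.64 per year, as stated

uk_population = 69_487_000         # ONS estimate, November 2025
share_spending = 0.415             # 41.5% of respondents expect to spend on apps

total_spend = uk_population * share_spending * annual_spend
print(f"£{total_spend / 1e9:.2f} billion")
```

This lands at roughly £2.33 billion, consistent with the "£2.3 billion" headline figure in the article.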


Graham is the Senior Editor for AI at TechRadar. With over 25 years of experience in both online and print journalism, Graham has worked for various market-leading tech brands including Computeractive, PC Pro, iMore, MacFormat, Mac|Life, Maximum PC, and more. He specializes in reporting on everything to do with AI and has appeared on BBC TV shows like BBC One Breakfast and on Radio 4 commenting on the latest trends in tech. Graham has an honors degree in Computer Science and spends his spare time podcasting and blogging.


