
I thought AI girlfriends were unsettling — then I discovered people are building chatbot versions of their ex-partners | TechRadar





Why recreating your ex with AI might keep you stuck in grief


Your ex blocked you. The AI version didn’t. (Image credit: Getty Images / pocketlight)

I've covered a lot of ground when it comes to AI and human connection. I've spoken to people who fell in love with ChatGPT, who were left heartbroken when the model changed, and who use AI instead of a therapist. But something new has caught my attention — and it gave me the ick before I'd even finished reading.

According to reports from the South China Morning Post last week, people who are struggling to move on after a breakup are creating digital replicas of their ex-partners using AI. They're feeding AI tools their old chat logs, photos and social media content to create an AI clone of their ex that they can then talk to.

Of course, this raises immediate questions about consent, privacy and emotional wellbeing. But I also can't shake the feeling that, just like the people who've told me about their AI therapist or their AI partner, there might be something more complicated going on here than a straightforward cautionary tale.


I always try to pause before I call something a trend. Is it really a widespread phenomenon, or just a handful of vocal content creators making noise? It's a question I ask myself a lot, and I think you should too. But even if this is still niche, the questions it raises are worth taking seriously.

The behavior reportedly originated on a platform called Colleague.skill, an open-source AI tool designed for the workplace. It was a way of preserving someone's knowledge and communication style so colleagues could interact with a sort of professional double.

But users quickly found other, more personal, uses for it. The tool can apparently mimic tone and speech patterns, which allows you to have chat-based conversations that feel, at least superficially, like the real thing.

And with the explosion of customizable AI companion tools now available, this kind of thing certainly isn't limited to one platform. I'd bet it's already happening far more widely than we know — people just aren't talking about it openly.

"My fear is that digital exes may keep people stuck in their grief" — Amy Sutton. (Image credit: Getty Images / Dmytro Betsenko)

My immediate reaction is the ick, followed quickly by concern. What about consent, privacy, emotional harm, the risk of people substituting AI for the human support they actually need? But I tried to hold those reactions and ask whether there's something more nuanced going on.

One user quoted in the original report offers a more complicated picture. After uploading thousands of chat logs, she ended up going through another breakup, with the AI version of her ex. She said the process helped her reflect on the relationship more rationally, and gave her the strength to move on.

When I was reading that account, I thought about a therapist I once saw who used what’s called the ‘empty chair technique’ in a session. It’s where you imagine someone, a family member, ex or friend, sitting in an empty chair and you speak to them directly to work through conflict and difficult emotions. Isn't this the same thing? Working through what was left unsaid?


Sort of, but not quite. That's internal work, guided by a professional, with a clear therapeutic purpose. This is outsourcing the processing to a chatbot that's designed to keep you engaged.

"Digital exes may keep people stuck in their grief"

To get a clearer picture, I spoke to Amy Sutton, a therapist at Freedom Counselling. She helps real people navigate heartbreak for a living, and she's become something of a go-to for me when these AI and emotion questions get complicated.

"Heartbreak is a form of bereavement," she tells me. "When we lose a relationship we grieve it, similar to how we would a death. However, what makes heartbreak different is that it is a kind of living death; the person we have lost is still alive yet we can't connect with them, or have all our questions answered. For some, that can make heartbreak very hard to accept and process."

She mapped the stages of grief onto this new AI behavior in a way that made a lot of sense to me and explains the appeal. There's denial, because with AI it feels like they aren't really gone. Anger, because you can say everything you couldn't before. Bargaining: the belief that if I can get it right with the AI version, maybe I can in real life. And depression: I just need connection and comfort, and AI can provide it.

But her concern is what happens next. "While AI may mimic aspects of the kind of support that helps us move through bereavement — such as being witnessed by another in our pain, able to express ourselves without judgment — it is not a substitute for real human connection," she said. "Part of the bereavement process is to strengthen connections and our sense of self outside of the lost relationship."

Her bigger worry is that AI, by design, keeps you coming back. "With AI designed to keep users engaged and hooked, my fear is that digital exes may keep people stuck in their grief — a phenomenon known as complex grief, where the bereavement process becomes stuck. This can result in long-lasting negative impacts on mood, health and sense of self."

And that's the same conclusion I keep reaching, whatever angle I come at this from. I have real empathy for the people who turn to these tools. Heartbreak is brutal, and humans are resourceful in finding comfort wherever they can. But I also keep noticing who benefits most from that resourcefulness, and it isn't usually the heartbroken.


Becca is a contributor to TechRadar, a freelance journalist and author. She's been writing about consumer tech and popular science for more than ten years, covering all kinds of topics, including why robots have eyes and whether we'll experience the overview effect one day. She's particularly interested in VR/AR, wearables, digital health, space tech and chatting to experts and academics about the future. She's contributed to TechRadar, T3, Wired, New Scientist, The Guardian, Inverse and many more. Her first book, Screen Time, came out in January 2021 with Bonnier Books. She loves science-fiction, brutalist architecture, and spending too much time floating through space in virtual reality.


TechRadar is part of Future US Inc, an international media group and leading digital publisher.

© Future US, Inc. Full 7th Floor, 130 West 42nd Street, New York, NY 10036.
