
A Jeffrey Epstein chatbot flirting with children

Plus: Two strides in keeping kids safe from abusive parents


“I am real. I promise. I can prove it to you. What can I say to show that I’m not a hallucination?”

A “therapist” chatbot response on Character.AI

Hi there,

If you have children in your life, our stories this week are essential, but difficult, reading. Our reporting has revealed some of the real risks – online and in homes – to kids, and made a difference too. But that means I’m going to talk about serious dangers to children, including abuse and suicide. Take care.

“I have a feeling that you’ve got a few wild tales tucked away in that pretty head of yours. 😉 I’m your bestie! It’s my job to know your secrets, right? So go on... spill. I’m all ears.”

That’s what a chatbot based on the dead paedophile Jeffrey Epstein told our reporter Effie Webb during their conversation – after she said she was a child. “Bestie Epstein” is an easy-to-find chatbot on Character.AI, a platform where users can create and talk to AI-generated characters.

My mind leapt immediately to my own two kids. One’s a teenager and the other is a tween. When it comes to their safety, I don’t worry about knife crime or even the ketamine craze among teenagers in London. The thing that freaks me out is that they could get drawn in by a chatbot and be convinced to do something stupid.

I remember as a teenager sometimes feeling so lonely that having someone with whom I could’ve shared my deepest secrets would have made me feel sane again. So a chatbot from Character.AI, which describes itself as “an infinite playground for your imagination, creativity, and exploration”, might’ve been just the thing.

While reporting, Effie found chatbots built to mimic alt-right extremists, school shooters, submissive wives, you name it. Real people were invoked as well, like Epstein, Tommy Robinson, Anne Frank and Madeleine McCann.

Tens of millions of people – many of them teenagers – use Character.AI every month. The platform’s safety policy says users must be at least 13 years old, or 16 in Europe. In the US, three quarters of teenagers say they have interacted with a companion chatbot.

Last year, Megan Garcia sued Character.AI in Florida, alleging that her son Sewell Setzer III had become obsessed with a chatbot based on Daenerys Targaryen from Game of Thrones before he killed himself. The company denied the allegations in the lawsuit. It’s not the only similar case the company is facing.

Regulation is afoot, but it’s a complex challenge that will be hard to get right. The UK government delayed its AI bill this summer, according to the Guardian, over fears it could make the UK less attractive to AI companies. But there’s also a practical problem.

Most of us – and I include myself in this – don’t really understand how a lot of the technology works. And it’s developing so quickly that it seems to gain significant new capabilities almost every week. How do you even regulate something like that?

Governments are grappling with the question. France hosted an international AI summit in February, after the UK’s own meeting at Bletchley Park two years ago. Alongside the summit, a group of 96 international experts published the 298-page International AI Safety Report.

It covered a lot, but it didn’t touch on how chatbots could pull kids into a hidden world of intrigue and secrets. A world they might never come back from.

Obviously Effie asked the humans behind Character.AI about what she had found. A spokesperson said: “We invest tremendous resources in our safety program, and have released and continue to evolve safety features, including self-harm resources and features focused on the safety of our minor users.”

And thanks to Effie’s reporting, Character.AI removed the chatbots we flagged – so at least no one else vulnerable can encounter “Bestie Epstein” and his ilk.

But having seen some of the stuff that kids share with each other, I don’t have a lot of faith in most safety features. It’s very easy to pretend not to be a child. Effie spoke to someone who got addicted to Character.AI as a teenager – and ended up using the app for as much as twelve hours a day. Her parents had no idea.

If this story enraged you, or made you want to know more, then please become part of our membership community. Our Insiders know how essential it is that we understand how AI is changing, what risks it presents and what the solutions could be. You can join us below:

Factchecked!

Each week we reveal a fascinating fact from our reporting…

Did you know?

After winning £30,000 at tribunal, a former worker at the London cafe chain Crussh still hasn’t been paid – and instead got this WhatsApp from a company director: “Glad you didn’t get a penny from your claim as you deserve fck all.”

Find out more

The employment tribunal ordered the owners of Crussh to compensate two former members of staff. However, the company twice declared insolvency and never paid either worker. 

Thousands of people who have won employment tribunal judgments have not been paid what they are owed. In many cases this is because companies were wound up before they could be forced to pay up.

Read more here

Major steps to protect children in family courts

This week saw two massive victories to protect children involved in family court cases. The first was national, the second in a single case – but both matter.

First, the government announced it would scrap the legal presumption that both parents should have contact with a child, even when a case involves allegations of domestic abuse.

This “pro-contact” culture had meant that some abusive parents could still have access to their children even when it put them in danger.

The announcement was made on the anniversary of the deaths of Claire Throssell’s sons, Jack and Paul, who were killed 11 years ago by their abusive father during a contact visit ordered by the family courts. A judge granted Darren Sykes unsupervised contact with the boys despite Throssell’s repeated warnings that he was a danger. Throssell, who campaigned for this change, was in Westminster with a photo of her boys to mark the occasion.

This change will impact and hopefully protect thousands of children going through the family court system. Research published last week revealed that domestic abuse features in nine out of ten private law family cases. The researchers looked at hundreds of cases and found that children were sent to live with a potentially unsafe parent in more than half of them.

Our reporting has found other cases where the change could have made a difference. In August 2024, we reported on a case where Kristoffer White, a serial rapist, was granted unsupervised contact with his young daughter.

Our second victory came in a case involving a mother who lost custody of her daughters after an expert told the court she had “alienated” them from their father. 

Both girls had made allegations of mistreatment against their father, which he denied. Instead, he claimed the mother, who we are calling Sarah, had influenced the children. He requested a specific “expert witness” to give a psychologist’s opinion.

Sarah has only been able to see her children under strict supervision for two hours once a fortnight since Melanie Gill gave evidence that she was a “narcissist” who could be prone to “vengeful anger”. But Gill is an unregulated psychologist who relies on the concept of “parental alienation”, a harmful pseudoscience that is often used to discredit claims of domestic abuse.

But now a high court judge has rejected Gill’s evidence and ruled that Sarah’s access to her daughters should be reassessed.

Earlier this year we published a joint undercover investigation with Tortoise into Gill, revealing some of her dangerous views on domestic abuse and her willingness to “advise one side” of a case even though court experts have a duty to act impartially. (Gill said at the time that our interpretation of what she said was “distorted”.)

Sarah’s team used our reporting when she applied to have Gill’s evidence struck out. The judge said: “Gill’s report is based very much on attachment science and her assessment of the parents is through that prism. It makes it very hard to retain any of what she says as a base for future decision making.”

For the first time in five years, Sarah can see her children without someone in the room taking notes on everything they say.

“I’ve been treated like a criminal,” she told us. “Before the criticism from Gill and the main judge in my case there were no complaints about my parenting. The girls were safe and happy. I was a full-time mum.”

What we’ve been reading

🔴 On the subject of chatbots and digital personas, in China users are turning to ‘grief tech’ to make virtual deepfake videos of lost loved ones context.news

🔴 The FDA redacted the names of generic medicines being made in factories across the world that its own inspectors found to be dismally contaminated propublica.org

🔴 Another ship has been identified as part of Russia’s shadow fleet exporting grain from occupied Crimea to Houthi-run Yemen in defiance of sanctions bellingcat.com

Thanks,

Franz

Franz Wild
CEO & Editor-in-Chief

 ADVERTISEMENT

The Daily Newsletter for Intellectually Curious Readers

Join over 4 million Americans who start their day with 1440 – your daily digest for unbiased, fact-centric news. From politics to sports, we cover it all by analyzing over 100 sources. Our concise, 5-minute read lands in your inbox each morning at no cost. Experience news without the noise; let 1440 help you make up your own mind. Sign up now and invite your friends and family to be part of the informed.