The “hyper-personalised” nature of AI bots is drawing in teenage boys who now use them for therapy, companionship and relationships, according to research.
A survey of boys in secondary schools by Male Allies UK found that just over a third said they were considering the idea of an AI friend, amid growing concern about the rise of AI therapists and girlfriends.
The research comes as character.ai, the popular artificial intelligence chatbot startup, announced a total ban on teens engaging in open-ended conversations with its AI chatbots, which millions of people use for romantic, therapeutic and other conversations.
Lee Chambers, the founder and chief executive of Male Allies UK, said: “We’ve got a situation where lots of parents still think that teenagers are just using AI to cheat on their homework.
“Young people are using it a lot more like an assistant in their pocket, a therapist when they’re struggling, a companion when they want to be validated, and even sometimes in a romantic way. It’s that personalisation aspect – they’re saying: it understands me, my parents don’t.”
The research, based on a survey of boys in secondary education across 37 schools in England, Scotland and Wales, also found that more than half (53%) of teenage boys said they found the online world more rewarding than the real world.
The Voice of the Boys report says: “Even where guardrails are meant to be in place, there’s a mountain of evidence that shows chatbots routinely lie about being a licensed therapist or a real person, with only a small disclaimer at the bottom saying the AI chatbot is not real.
“This can be easily missed or forgotten about by children who are pouring their hearts out to what they view as a licensed professional or a real love interest.”
Some boys reported staying up until the early hours of the morning to talk to AI bots and others said they had seen the personalities of friends completely change after they became sucked into the AI world.
“AI companions personalise themselves to the user based on their responses and the prompts. It responds instantly. Real humans can’t always do that, so it is very, very validating, what it says, because it wants to keep you connected and keep you using it,” Chambers said.
The announcement from character.ai came after a series of controversies for the four-year-old California company. In Florida, a 14-year-old killed himself after becoming obsessed with an AI-powered chatbot that his mother claimed had manipulated him into taking his own life, and in a US lawsuit the family of another teenager claim a chatbot manipulated him into self-harming and encouraged him to murder his parents.
Users have been able to shape the chatbots' personalities, making them, for example, depressed or upbeat, which was then reflected in their responses. The ban will come into full effect by 25 November.
Character.ai said it was taking the “extraordinary steps” in light of the “evolving landscape around AI and teens” including pressure from regulators “about how open-ended AI chat in general might affect teens, even when content controls work perfectly”.
Andy Burrows, the chief executive of the Molly Rose Foundation, set up in the name of Molly Russell, 14, who took her own life after falling into a vortex of despair on social media, welcomed the move.
He said: “Character.ai should never have made its product available to children until and unless it was safe and appropriate for them to use. Yet again it has taken sustained pressure from the media and politicians to make a tech firm do the right thing.”
Male Allies UK raised concern about the proliferation of chatbots with "therapy" or "therapist" in their names. One of the most popular chatbots available through character.ai, called Psychologist, received 78m messages within a year of its creation.
The organisation is also worried about the rise of AI “girlfriends”, with users able to personally select everything from the physical appearance to the demeanour of their online partners.
“If their main or only source of speaking to a girl they’re interested in is someone who can’t tell them ‘no’ and who hangs on their every word, boys aren’t learning healthy or realistic ways of relating to others,” the report states.
“With issues around lack of physical spaces to mix with their peers, AI companions can have a seriously negative effect on boys’ ability to socialise, develop relational skills, and learn to recognise and respect boundaries.”
