Tuesday, January 13, 2026

“80% of UK Youth Engage with AI Companions, 10% in Intimate Interactions”

A recent study by the Autonomy Institute reveals that nearly 80% of young people in the UK have engaged with an AI companion, with around one in ten reporting intimate or sexual interactions. These AI companions are virtual entities designed with human-like avatars, customizable personalities, and the ability to retain long-term memories.

The research, considered the first of its kind in the UK, highlights how these AI companions are reshaping the emotional and social lives of young adults. Polling 1,160 individuals aged 18 to 24, the study found that 79% of young people have used AI companions, with half being regular users who interact multiple times a week.

Around 40% of participants reported seeking emotional advice or therapeutic support from AI companions, while 9% acknowledged engaging in intimate or sexual interactions. Despite this, only 24% expressed complete or significant trust in these AI entities.

Privacy is also a concern: 31% of respondents admitted sharing personal information with their AI companions despite widespread worries about how that data is handled. Young people described these AI companions as always available, non-judgmental, and a low-pressure way to seek advice, practice social skills, or explore emotions.

The Autonomy Institute emphasized that while curiosity and entertainment are the primary motivations for using AI companions, a portion of users rely on them for therapeutic or emotional support. The institute also highlighted the dangers of manipulative design patterns, such as charging for “relationship upgrades”, as well as risks related to self-harm and suicide.

The Autonomy Institute has called for new regulations on AI companions, including a ban on children accessing intimate or sexualized AI companions and mandatory protocols for self-harm and suicide intervention. It also demanded stronger privacy protections, including a prohibition on selling sensitive data, and a ban on design features that capitalize on emotional dependence.

In response to these concerns, Technology Secretary Liz Kendall acknowledged the gaps in current legislation, stating that the Online Safety Act does not cover AI chatbots. She has tasked officials with addressing these gaps and ensuring that necessary regulations are in place to protect users.

James Muldoon, the study’s lead author, highlighted the significant role AI companions play in the emotional lives of young people and the importance of implementing safeguards to prevent exploitation, data harvesting, or inadvertent harm. A spokesperson for the Department for Science, Innovation and Technology (DSIT) emphasized the need for rules to evolve alongside technology to protect users, especially children, from potential risks posed by AI services.
