London: The UK’s data protection watchdog, the Information Commissioner’s Office (ICO), and communications regulator Ofcom have written to several major social media platforms demanding stronger safeguards for children.
Ofcom has contacted platforms including Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube, giving them until the end of April to explain what actions they are taking to improve age-verification checks and prevent online grooming.
The regulator has also asked the companies to outline how they are addressing potentially harmful algorithms and how product updates are introduced to users. Ofcom said that it expects an ‘end to product testing on children.’
At the same time, the ICO has written to TikTok, Snapchat, Facebook, Instagram, YouTube, and X asking how their age-verification policies work and whether they effectively protect young users.

The move follows a Conservative-led attempt to introduce a ban preventing children under 16 from using social media. The proposal was voted down in the House of Commons by 307 votes to 173. Although ministers initially opposed the idea, the government has since begun consulting on a possible ban but has not committed to supporting it.
Meanwhile, Australia became the first country to implement a nationwide social media ban for children when its policy took effect in December last year.
Age-verification rules broken
Research by Ofcom suggests existing minimum-age rules, usually set at 13 years old, are not being properly enforced. The regulator found that 72 percent of children aged eight to 12 are using websites and apps that officially prohibit users of their age.
Ofcom Chief Executive Dame Melanie Dawes criticised major technology firms for failing to prioritise children’s safety. Dawes said that a clear gap remains between what companies promise and the protections they actually implement.

Dawes warned that without effective safeguards such as reliable age-verification checks, children continue to be exposed to risks on online services they cannot realistically avoid. According to Dawes, the situation must improve quickly, or Ofcom will take regulatory action.
ICO Chief Executive Paul Arnold also noted that public concern about children’s safety online is growing and that the current situation is unacceptable. Arnold stressed that companies now have access to modern technology capable of ensuring effective age assurance and therefore have ‘no excuse’ not to implement stronger protections.
Ofcom said that it will publish a report in May detailing how the platforms responded to its requests. At the same time, the regulator will release new research examining the impact of the Online Safety Act during its first year, particularly on children’s online experiences.

The regulator added that it is prepared to take enforcement action if the responses from companies are unsatisfactory, including tightening regulatory requirements. Similarly, the ICO highlighted that it has contacted some of the highest-risk services and warned that additional regulatory measures could follow if companies fail to act.
The push for stronger protections has been welcomed by the Molly Rose Foundation, a charity established in memory of a 14-year-old girl who died after viewing harmful content on social media. The organisation stated that the action shows regulators are “turning up the heat on reckless tech firms and their dangerous products which continue to cause daily harm to children.”
Age-appropriate social media
A spokesperson for YouTube said that the platform has spent more than a decade developing products designed specifically for children and teenagers to ensure age-appropriate, high-quality experiences. The company added that it was surprised by Ofcom’s approach, noting that it regularly updates regulators about its youth safety initiatives.

Meta, the parent company of Facebook and Instagram, said that it has already implemented many of the safety measures being requested. These include using artificial intelligence to estimate users’ ages based on their activity and employing facial age-recognition technology.
Meta added that teenagers are automatically placed into Teen Accounts, which contain built-in protections that limit who can contact them, restrict the type of content they see, and control how long they spend on the platforms.
A spokesperson for Roblox said that the company remains in regular discussions with Ofcom about protecting players and has introduced more than 140 safety features in the past year, including mandatory age checks for access to chat functions.
The company acknowledged that no system is perfect but added that it continues strengthening its protections and looks forward to demonstrating its progress during ongoing discussions with the regulator.

