London: UK media regulator Ofcom has unveiled the final version of its children’s codes under the Online Safety Act, laying out strict new requirements for digital platforms to protect young users from harmful online content.
The new regulations, which will take effect on 25 July, require social media, search, and gaming services to carry out risk assessments, filter out harmful content using algorithms, verify users’ ages, respond quickly to flagged issues, and make it easier for children to report abusive or dangerous content.
Described by Ofcom as a ‘reset for children online’, the codes are based on feedback from over 27,000 children and 13,000 parents and include more than 40 safety measures.
‘We’re building a safer online world for children across the UK. From July, social media & tech companies must implement strong age assurance and protection methods to help shield children from harmful content.’
— Department for Science, Innovation and Technology (@SciTechgovuk), April 24, 2025
Spanning social media, search engines, and gaming platforms, the codes outline over 40 safety measures, including:
- Safer content feeds through algorithmic filtering of harmful material
- Stricter age checks, with the highest-risk platforms required to implement highly effective age assurance
- Rapid response mechanisms to detect and address harmful content
- Simplified reporting systems for children to flag inappropriate or dangerous content
- Greater user control, allowing children to block or mute accounts and disable comments on posts
Services deemed ‘riskiest’ must deploy highly effective age assurance tools, while all platforms must enable children to block users, mute accounts, and disable comments.
Dame Melanie Dawes, Ofcom’s Chief Executive, said that the measures will lead to ‘safer social media feeds with less harmful and dangerous content’ and better protections from unsolicited contact and adult material.

Technology Secretary Peter Kyle hailed the move as a ‘watershed moment’ in fighting ‘lawless, poisonous environments’ online and in holding tech firms to account.
However, some bereaved parents argue that the new rules fall short and that Ofcom’s approach prioritises corporate profits over child safety; they have urged the Prime Minister to step in personally.
In response, a Meta spokesperson said that all UK teens on Instagram have been moved to accounts with enhanced protections and that under-16s need parental approval to modify safety settings.
Meta also supports app stores and operating systems taking responsibility for verifying a child’s age and obtaining parental permission before allowing app downloads.
The Online Safety Act, passed in October 2023, primarily focuses on protecting children, with these codes forming the backbone of Ofcom’s enforcement. Earlier this month, the regulator launched its first investigation under the new law into a suicide forum.