United Kingdom: The UK’s Online Safety Act (OSA) requires online platforms to complete assessments of whether their services expose users to illegal content by 16th March 2025, or face hefty fines of up to 10 percent of their global turnover.
Ofcom, the regulator overseeing the law, has published its final codes of practice, outlining how firms must manage illegal material on their platforms.
New Requirements for Risk Assessments
Online platforms are now required to conduct comprehensive risk assessments to identify potential harms related to illegal content, such as child sexual abuse material (CSAM), coercive behaviour, and the promotion of self-harm. Companies that fail to address these issues risk financial penalties.
According to Ofcom Chief Executive Dame Melanie Dawes, this is a “last chance” for companies to implement significant changes in their operations before enforcement begins in March 2025.
📋 @Ofcom has released its first codes of practice for the Online Safety Act
🤝 We’re urging Ministers to meet with key campaigners who are raising the alarm about gaps in these regulations to urgently address their concerns
— Victoria Collins MP (@TweetingCollins) December 16, 2024
“If they don’t start to seriously change the way they operate their services, demands for tougher actions, such as banning children from social media, will intensify,” Dame Melanie Dawes warned.
While the OSA is widely hailed as a step towards safer online spaces, critics argue that the Act falls short of addressing a broad range of harms, particularly those affecting children.
Ofcom’s Strengthened Codes of Practice
The final Ofcom codes provide greater clarity on the removal of intimate image abuse and child exploitation content while introducing measures to protect children from harmful social media interactions.
Among the requirements, platforms must use hash-matching technology to detect CSAM. This technology computes a unique digital signature (a hash) for each media file, which can then be compared against databases of hashes of known CSAM, allowing illegal content to be identified quickly without manual review.
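The matching step described above can be sketched as follows. This is a minimal illustration, not a production system: the hash values and database here are hypothetical, and real deployments use curated hash databases (such as those maintained by the Internet Watch Foundation) together with perceptual hashes like PhotoDNA, which still match after re-encoding or resizing, whereas a plain cryptographic hash such as SHA-256 only matches byte-identical files.

```python
import hashlib

# Hypothetical database of known-content signatures (illustrative value only:
# this is simply the SHA-256 digest of the bytes b"foo").
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_signature(data: bytes) -> str:
    """Compute a digital signature (SHA-256 hex digest) for a media file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, known_hashes: set[str] = KNOWN_HASHES) -> bool:
    """Return True if the file's signature appears in the known-content database."""
    return file_signature(data) in known_hashes
```

In practice, the signature lookup is a constant-time set membership test, which is why hash matching scales to scanning uploads on platforms handling millions of files per day.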
In response to rising concerns, major platforms like Facebook, Instagram, and Snapchat have already introduced safety features for younger users, including restrictions on who can message minors and who can discover their profiles. Instagram has also introduced measures to prevent sextortion by blocking screenshots of direct messages.
Technology Secretary Peter Kyle praised Ofcom’s new codes, calling them a “significant step” toward creating a safer internet for UK users.
Despite the progress, campaigners have raised concerns over the wide-reaching implications of the OSA, particularly regarding platform age verification and the potential privacy issues involved.
The illegal content codes are still awaiting final parliamentary approval but are expected to come into effect on 17th March 2025.