London: Roblox has implemented extensive new age-based controls as part of a global effort to strengthen protections for younger users.
The platform, which attracts more than 80 million daily players with about 40 percent under the age of 13, has faced increasing scrutiny over exposure to inappropriate content and contact with adults.
Mandatory age checks for Roblox chat features have been introduced, beginning in Australia, New Zealand and the Netherlands in December, followed by a wider rollout in January. The update aims to reduce interactions between children and unrelated adults, responding to investigations and lawsuits in several US states where concerns about child safety have grown.
Under the new system, users must complete facial age estimation using the Roblox app’s camera. The technology, managed by an external provider, processes images temporarily before deleting them after verification.

Once verified, users are assigned to one of six age groups: under nine, nine to twelve, thirteen to fifteen, sixteen to seventeen, eighteen to twenty, and twenty-one plus. Chatting is restricted to similar age ranges, with interaction across groups allowed only for ‘trusted connections’ added manually.
Parents can continue to manage accounts and update ages after verification.
For under-13 users, restrictions on private messaging and specific chat functions remain unless parental approval is provided. Roblox has highlighted that images and videos are still prohibited within chats and that link sharing remains heavily restricted.
The changes follow concerns raised in earlier testing, during which a 27-year-old user and a 15-year-old user on separate devices were able to exchange messages despite platform rules. Regulators and safety groups have welcomed the new measures.

Ofcom’s online safety supervision director Anna Lucas has said that platforms must take active steps to keep young people safe under the UK’s Online Safety Act. The NSPCC has also called for continued improvements that genuinely protect children from manipulation and abuse.
In the US, Roblox is facing legal action in Texas, Kentucky and Louisiana, adding pressure on the company to prove that safety reforms are effective. The company has stated that the updated framework will create age-appropriate experiences and expects other gaming platforms to adopt similar verification systems.
Campaign groups ParentsTogether Action and UltraViolet have staged a virtual protest within Roblox, delivering a digital petition signed by more than 12,000 people. The petition urged the platform to strengthen its safety policies further and to ensure robust protections against predatory behaviour by adults.
Roblox has described the new approach as a step designed to limit adult-child communication, encourage safer interactions and reduce long-standing risks within the gaming environment.

