London: Online grooming crimes in the United Kingdom have nearly doubled since 2017, reaching their highest level on record, according to new figures from the National Society for the Prevention of Cruelty to Children (NSPCC). The charity described the six-year rise as ‘deeply alarming,’ warning that children as young as four have fallen victim to online predators.
In the year to March 2024, police recorded 7,263 online grooming offences, compared with 3,728 in the 12 months ending March 2018. The findings are based on data gathered through freedom of information requests sent to police forces across England, Wales, Scotland, and Northern Ireland.
The NSPCC’s report revealed that a technology platform was identified in over 2,100 cases of online grooming, with Snapchat being the most common, linked to approximately 40 percent of all recorded offences. WhatsApp, Facebook, and Instagram, all owned by Meta, were each connected to 9 percent of cases.
Among the victims whose gender was known, 80 percent were girls, though the youngest recorded victim was a four-year-old boy. The charity did not specify which police force handled this case, citing concerns about the child’s privacy.

Experts suggest one reason for the increase in recorded crimes may be the introduction of the sexual communication with a child offence in April 2017 in England and Wales, which targets groomers contacting minors through mobile phones or social media. Northern Ireland has recorded the offence since 2015, while Scotland introduced a similar law in 2010.
The report also referenced the recent Online Safety Act, introduced to strengthen online child protection measures. It highlighted a landmark case last month, where a married man became the first person in the UK jailed for encouraging a child to self-harm, after creating a secret online world to manipulate and abuse a young girl.
Despite the recorded surge, the NSPCC cautioned that the actual number of victims may be significantly higher, since each police-recorded offence could involve multiple victims or communication methods. The charity added that many offences occur in private online spaces, making them harder to detect.
Matthew Sowemimo, the NSPCC’s Associate Head of Child Safety Online, noted that Snapchat is used by nearly three-quarters of British children, a popularity that likely contributes to its high association with grooming cases. He explained that the app’s ‘quick add’ feature enables adults to contact large numbers of child users directly, making it easier for predators to initiate communication.

The NSPCC warned that offenders are continuously adapting their methods across multiple platforms, often creating fake profiles and using cross-platform manipulation to build trust with victims. The charity urged tech companies to analyse metadata, such as repeated adult contact with minors or multiple fake accounts, to identify suspicious behaviour. It clarified that this would not require reading private messages but could help flag potential grooming activity.
In response, Snapchat stated that it collaborates with law enforcement and child safety organisations to remove abusive activity and block potential offenders. The platform restricts teen visibility in search results and requires mutual connections or existing phone contacts before direct messaging is possible.
Meta also said that it uses technology to proactively detect and remove child exploitation content, reporting that between January and March 2024, it removed over six million pieces of such content from Facebook and Instagram, with 97 percent identified before being reported. The company maintained that it already provides user protections recommended by the NSPCC.
The NSPCC concluded that while technology firms have made progress, further action is needed to detect online grooming, protect minors, and ensure that online spaces are safe for children across the UK.

