The Online Safety Act is one year old. Has it made children any safer?
In-depth report
The Online Safety Act includes a legal requirement for tech platforms to keep children safe. From December, big tech firms will have to publish their risk assessments on illegal harms on their platforms.
Recorded instances of grooming have been increasing since the offence of sexual communication with a child came into force in 2017, reaching a new record high of 7,062 this year.
Of the 1,824 cases where the platform was known in the last year, 48% were recorded on Snapchat.
Reported grooming offences on WhatsApp rose slightly in the past year. On Instagram and Facebook, known cases have fallen over recent years, according to the figures. All three platforms are owned by Meta.
Online safety advocates point out that the design of Snapchat can itself pose a danger. Messages and images on Snapchat typically disappear after 24 hours, and users can see if a recipient has screenshotted or saved any images shared.
This makes criminality more difficult to track than on other platforms.
The NSPCC added that the new findings reveal Snapchat and Meta-owned platforms were those most commonly used by perpetrators to target children online last year.
It added that girls were predominantly targeted by offenders for online grooming, making up 81% of total UK recorded cases where gender was known in 2023/24.
Social media faces big changes under new Ofcom rules
Media regulator Ofcom, which will enforce those rules, said: "Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children. We’re prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes."