NSPCC Urges Action After Child Sexual Abuse Findings Revealed

Charity calls for stronger online protections after data reveals alarming rise in child abuse image crimes.


Police in England and Wales recorded 38,685 child abuse image offences last year, the equivalent of more than 100 crimes every single day.

The NSPCC examined which platforms were involved. Law enforcement identified the platform in 7,338 of those cases: half of the crimes took place on Snapchat and a quarter on Meta-owned platforms.

Instagram accounted for 11% of the offences, Facebook for 7% and WhatsApp for 6%. The figures do not include Facebook Messenger.

The charity requested data from every police force, and 20 forces supplied usable figures for 2023/24, identifying the platform in 7,338 recorded offences. Snapchat was flagged 3,648 times, Instagram 840 times and Facebook 537 times. WhatsApp was flagged 457 times, YouTube 330 times (around 4%), X, formerly Twitter, 250 times (around 3%), TikTok 245 times (3%) and Kik 124 times (around 2%).

Home Office data put the total at 38,685 offences. Some police forces have resisted sharing social media data, citing the time or cost involved. This year's figures were drawn from a keyword search, so they may not capture every method used.

Child abuse image offences also rose elsewhere in the UK: Northern Ireland recorded a record 859 such crimes, while Scotland saw 748.

The NSPCC and other charities have written a joint letter to two senior government figures raising concerns about Ofcom's approach. They warn that children will not be kept safe online, particularly in private messaging spaces, and argue that this runs counter to the aims of the Online Safety Act.

Ofcom's rules require services to remove illegal content only where it is feasible to do so. The charities see this as a loophole that could let some services avoid their child protection duties, even though police data links private messaging to most of these crimes and such spaces allow abusers to hide their actions.

The NSPCC wants Ofcom to impose stricter rules, including safeguards on private messaging apps and those using end-to-end encryption. End-to-end encryption protects user privacy, but it can also prevent providers from detecting abuse material.

Childline has seen evidence of children being targeted through private messaging apps. The service delivered 903 counselling sessions relating to blackmail in 2023/24, a 7% increase. One 13-year-old girl told Childline she had sent photos to a stranger on Snapchat, who then threatened to post the pictures online.

Chris Sherwood, the NSPCC's chief executive, said the offences cause tremendous distress and expressed outrage that tech firms continue to disregard children's safety. He argued that separate rules for private messaging let tech bosses off the hook.

These crimes have continued despite the Online Safety Act, and the charity says the government must hold tech companies accountable and ensure children are protected from online abuse.

Source: https://www.bournemouthecho.co.uk/news/national/uk-today/24943192.nspcc-calls-action-following-shocking-child-sexual-abuse-findings/