THE number of crimes involving images of child abuse in North Yorkshire has rocketed, according to the latest figures.

According to the latest numbers from North Yorkshire Police, the figures have risen from 319 in 2021/22 to 539 in 2022/23 — a 69 per cent increase — and more than 2,400 offences have been recorded by the force since 2017/18.

Children's charity the NSPCC, which obtained the data, is urging tech companies to improve their platforms and calling on Ofcom to strengthen its approach to tackling child sexual abuse online.

The NSPCC are calling for urgent action

More than 33,000 offences in which child abuse images were collected and distributed were logged in 2022/23, according to Freedom of Information data from 35 police forces across the UK.

The charity said the figures show the need for swift and ambitious action by tech companies to address what is currently happening on their platforms and for Ofcom to significantly strengthen its approach to tackling child sexual abuse through effective enforcement of the Online Safety Act.

The new data shows the widespread use of social media and messaging apps in child sexual abuse image crimes, which the NSPCC says results largely from a failure to design child safety into products.

It comes as insight from Childline shows young people being targeted by adults to share child abuse images via social media and the calculated use of end-to-end encrypted private messaging apps by adults to find and share child abuse images.

Childline have responded to the data

A 14-year-old girl, who has not been named, told the NSPCC-run service: “One night I got chatting with this guy online who I’d never met and he made me feel so good about myself. He told me he was 15, even though deep down I didn’t believe him. I sent him a couple of semi-nudes on Snap(chat), but then instantly regretted it. I asked him to delete the pics, but he just kept on making me do stuff for him not to post them – like getting me to strip live on camera. I just want to block him, but if I block him he will just post the pictures.”

A 15-year-old boy, who also remained anonymous, told Childline: “A while ago I saw a video on YouTube about how a guy was busting paedophiles and creeps on the internet by pretending to be a kid, and I kind of wanted to do a similar thing.

"I looked around Instagram for the creepiest accounts about kids my age and younger. In the end, I came across this link on one of their stories. It’s a link to a WhatsApp group chat in which child sexual abuse material is sent daily. There are literally hundreds of members in this group chat and they’re always calling the kids ‘hot’ and just being disgusting.”

A consultation into Ofcom’s first codes for companies to adopt to disrupt child sexual abuse on their platforms closed last week.

The NSPCC wants these measures introduced without delay, but has urged Ofcom to begin work on a second version of the codes that will require companies to go much further.

The charity said companies should be required to use technology that can help identify and tackle grooming, sextortion and new child abuse images.

The charity also wants tougher measures for private messaging services to make child protection a priority, including in end-to-end encrypted environments.

Sir Peter Wanless, NSPCC Chief Executive, said: “The Online Safety Act sets out robust measures to make children fundamentally safer on the sites and apps they use so they can enjoy the benefits of a healthy online experience.

“Ofcom has been quick off the blocks but must act with greater ambition to ensure companies prioritise child safety in the comprehensive way that is so desperately needed.”

Sir Peter Wanless

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation, the UK’s front line against child sexual abuse imagery online, said: “This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.”

A Snapchat spokesperson said: “Child sexual abuse is horrific and has no place on Snapchat. We use cutting-edge detection technology to find and remove this type of content, and work with police to support their investigations. Snapchat also has extra safety features for 13- to 17-year-olds, including pop-up warnings if they’re contacted by someone they don’t know.”