Number of webpages containing most extreme child sex abuse images 'doubles'

A record 51,370 webpages containing Category A sexual abuse material were found in 2022.

Number of webpages containing most extreme child sex abuse doubles, Internet Watch Foundation says (iStock)

The number of images of children as young as seven being abused online has risen by almost two thirds, while the number of webpages found to contain the most extreme material has doubled in recent years, according to a report.

As children become more active online, they are growing in vulnerability to grooming and abuse by strangers “even in their own bedrooms”, the Internet Watch Foundation (IWF) warned.

The foundation, which is the UK organisation responsible for tracking down child sexual abuse imagery on the internet, found a record 51,370 webpages containing Category A child sexual abuse material that it took action to remove or block from the internet in 2022.

This category covers the most severe kinds of sexual abuse, including imagery of babies and acts such as bestiality or sadism.

The amount of Category A content has doubled since 2020 when the IWF uncovered 25,000 pages with this kind of abuse, the foundation said.

In its annual report published on Tuesday, it said the total number of URLs found in 2022 containing Category A child sexual abuse material was higher than the organisation had ever seen before, accounting for a fifth of all the content the IWF sees – up from 17% in 2020.

The foundation, which works alongside industry and law enforcement to make sure such content is quickly taken down, also noted a 60% increase overall in the number of abuse images featuring children aged seven to ten years old, across all three categories.

In total, 255,571 webpages were confirmed as containing child sexual abuse imagery, having links to the imagery or advertising it – a 1% increase from 2021, the IWF said.

It added that each webpage could contain one, tens, hundreds or even thousands of individual child sexual abuse images or videos.

Category B images are those involving non-penetrative sexual activity, while Category C covers other indecent images not falling within Categories A or B.

The rise in the most severe images being found was branded “deeply disturbing” by the National Police Chiefs’ Council (NPCC) which warned of the “life-long harm it causes these children”.

The IWF annual report, published on Tuesday, said: “As ever-younger children become more tech-aware and active online, they become more vulnerable to grooming and abuse by strangers – even in their own bedrooms.”

IWF chief executive Susie Hargreaves described the increase in reports including images and videos of the sexual abuse of children aged seven to ten as “heartbreaking”.

She said: “Sexual imagery created of children when they are online, often in the supposed safe spaces of their bedrooms, now accounts for almost four in every five reports.”

She warned that any child from any background can be vulnerable “as all young children left unsupervised with a camera-enabled device and an internet connection are at risk”.

The IWF repeated its opposition to the introduction of end-to-end encryption on platforms “without there being the necessary, technically-possible, child safeguards in place” as it warned that tech companies must do everything they can to prevent the upload and distribution of images.


The foundation added: “Likewise, we have been working closely with colleagues across the UK Government to ensure that the Online Safety Bill does what it sets out to do and makes the UK a safer space to be online and as part of that, protect the critical work of the IWF as, without us, it will be children who will suffer.”

A senior analyst at the IWF, named only as Rosa, said: “People are now only one click away from Category A material. That is a public safety issue. This extreme material is no longer in the creepy corners of the internet. It’s in plain sight.”

The sites are typically not hosted by mainstream hosting companies but are instead mainly found on servers run by little-known companies based in Europe or Asia, the IWF said.

It said the UK hosts a “small volume” of online child sexual abuse content, with 640 webpages displaying child sexual abuse imagery hosted in the UK last year – an increase from 381 in 2021.

The UK hosted 0.25% of all the child sexual abuse webpages which the IWF identified in 2022.

While the majority (96%) of the imagery found showed girls, there has been a rise in imagery featuring boys, with 2,641 instances in 2021 compared to 6,253 in 2022.

Some of the most extreme sexual abuse is being perpetrated against the youngest children, the IWF said, with 81% of webpages showing the abuse of children aged from birth to two years old containing Category A material, as well as half of the imagery of three to six-year-olds.

Ms Hargreaves hailed a rise in the number of companies helping the IWF identify and remove such imagery as a positive sign.

She said: “In a year where we’re finding more imagery of the most severe types of sexual abuse, we’re also seeing more and more companies from around the world showing their determination to do something about it.”

Ian Critchley, NPCC lead for child protection and abuse investigations, said: “The rise in the most severe offending being found is deeply disturbing – not only are all internet users far more likely to be exposed to this harmful material, but it demonstrates once again how criminals have no regard for the life-long harm it causes these children.”

He said tech companies and platforms have a “moral and legal duty to keep children safe, and must not allow end-to-end encryption to stop us from identifying and removing this abuse”.

Messaging services including WhatsApp warned last week, in an open letter, that ministers should “urgently rethink” the Online Safety Bill amid concerns the legislation would give the regulator Ofcom the power to try to force the release of private messages on end-to-end encrypted communication services.

But the Government has argued that Ofcom will only be able to make companies use technology to identify child sexual abuse material in “appropriate and limited circumstances”, and the children’s charity the NSPCC urged ministers to resist the tech firms’ calls to water down the legislation.
