A new report by the Internet Watch Foundation (IWF), the first analysis of its kind, has revealed widespread online sexual abuse of three- to six-year-old children while they are using household devices.
Thousands of images and videos analysed by the IWF show children being manipulated into “disturbing” sexual acts on camera, including penetration, masturbation, bestiality, and sadism. Described as “self-generated” child sexual abuse imagery, where the perpetrator is remote from the victim, these images and videos were then shared on the open internet in the UK.
This content could therefore be accessed by people in the UK, though the victims were not necessarily from the UK itself – while the UK hosts relatively little child sexual abuse content, the National Crime Agency (NCA) has identified it as the third-largest global consumer of child sexual exploitation material.
Across 2023, 2,401 individual self-generated images and videos of children aged three to six were discovered, 91 per cent of which were of girls. Fifteen per cent of this content showed the most extreme (Category A) forms of sexual abuse.
The data also showed a record number of webpages containing child sexual abuse overall, with 275,652 webpages discovered – each of which can contain thousands of images or videos. The nature of the abuse has also become more extreme, with a 22 per cent increase in 2023 in webpages containing Category A child sexual abuse material (involving penetrative sexual activity, sexual activity with an animal, or sadism), the highest number on record. The IWF has seen a 38 per cent increase in Category A imagery since 2021.
This shocking report has prompted Security Minister Tom Tugendhat to call on technology companies to “urgently” tackle the issue.
“This report from the Internet Watch Foundation is devastating,” he told PoliticsHome.
“The thought of children aged between three and six being targeted by predators in their own home is horrifying, and it’s crucial that tech companies agree to work with the government and children’s charities to tackle this issue urgently.”
The government has already passed the Online Safety Act, which has led to Ofcom setting out safety duties and publishing codes of practice to encourage firms to put measures in place to protect children. However, delivery of the full regulatory regime is not expected until 2026, leaving it up to tech firms to determine to what extent they comply with Ofcom’s guidelines until then.
Tugendhat said that the government “now needs technology companies to play their part too” and that “we cannot afford not to act”.
A Home Office source told PoliticsHome that the department was particularly concerned about the increasing rollout of end-to-end encryption by top technology firms – Meta, for example, introduced encryption to its messaging services in December.
The source said there was “very little evidence” that firms were putting in robust safeguards against online child sexual abuse, even as UK police on average make 800 arrests a month of suspected sexual predators and safeguard 1,200 children a month from child sexual exploitation offences. The Home Office wants to “take the onus off children” to report abuse, placing more responsibility on tech firms.
Labour MP Sarah Champion, a long-time campaigner against child sexual exploitation, said the report showed the Online Safety Act had been a “missed opportunity” to put in “really robust child protection measures”. The MP said that while she was hopeful that a Labour government would take further steps in addressing online violence and exploitation – “because of our leader’s background, he’s been very front-footed about ending violence against women and girls” – she would like to see Labour tackle the “root causes” of sexual violence.
“I am optimistic that this is definitely in the scope… I’ll be mithering them if it isn’t.”
Although the IWF has welcomed the Online Safety Act, its Chief Executive Susie Hargreaves OBE insisted that “we can’t afford to wait until these codes come in”.
“The harms are happening to children now, and our response must be immediate,” she said.
Ian Critchley, National Police Chiefs’ Council (NPCC) lead for Child Protection, agreed that the responsibility could not fall solely on parents and carers.
“The biggest change though we must see is from the tech companies and online platforms,” he said.
“Companies are still failing to protect children and continue far too often to put profit before child safety. I welcome the Online Safety Act, but it should not have required this developing legislation to change the negligible approach to child safety by too many companies.”
The major social media platforms insist they already have robust systems in place. According to a TikTok spokesperson, TikTok removes any content that depicts or disseminates child abuse or sexual exploitation of children as soon as it becomes aware of it through its own detection methods, community reports, or industry partnerships, before reporting cases to authorities.
Research by Ofcom published last week showed that a third (32 per cent) of parents of five to seven-year-olds say they allow their child to use social media independently, compared to 42 per cent who say they use social media sites and apps together with their child. The IWF has therefore also called for a “whole society approach” and for children under six to be warned about online dangers via education and conversations at home.
“The opportunistic criminals who want to manipulate your children into disturbing acts of sexual abuse are not a distant threat – they are trying to talk to them now on phones and devices you can find in any family home,” Hargreaves said.
“If children under six are being targeted like this, we need to be having age appropriate conversations, now, to make sure they know how to spot the dangers. A whole society approach is needed.”
Champion said that having got her first phone at the age of 26, she believed the delays in tackling online abuse of children were somewhat generational: “Most people, even if they wanted to, until very recently wouldn’t have known it was going on.”
But she added that, to some extent, there is “wilful ignorance” among many parents.
“I do think that parents don’t want to know and then it’s diminished and belittled and it’s also messy and unpleasant,” she said.
“To know a six-year-old is being faced with bestiality and self-generating images, a lot of people back away from it.”
Champion also expressed concern about the “very, very effective” lobbying carried out by top tech firms who insisted they are doing everything they can to protect children online.
“They’re very slick and they’re very persuasive… TikTok was taking MPs out to dinner at the beginning of this last Parliament,” she said.
“They tell you to your face that they’re doing everything that they possibly can, and you want to hear that. So I think there’s been a lot of complacency in this place. Just look at the Online Safety Act: it’s okay, but it’s nowhere near good enough to address the problem.”
Shadow Minister for Safeguarding Alex Davies-Jones told PoliticsHome: “This report is a damning indictment of the way that online safety has been approached by the Government in recent years.
“The stark increase in this type of child sex abuse imagery shows that things need to change, and they need to change now. A Labour Government would significantly strengthen the Online Safety Act, closing the loopholes that could allow abuse to flourish, and work with international partners to address this global problem. There is no time to waste.”
Meta has been contacted for comment.