The NSPCC’s Childline service and the Internet Watch Foundation (IWF) have teamed up to launch a new tool, Report Remove, which enables young people under 18 to report nude images or videos of themselves that are online, so the IWF can review them and have them removed if they are found to be illegal.
This is the first time the IWF has accepted images and videos directly, rather than taking the URLs of the pages where images are hosted, as it usually does on its hotline.
The Report Remove tool has been developed in collaboration with UK law enforcement in a child-centric approach that preserves the victim’s confidentiality and anonymity. This approach, crucially, also helps to ensure that children will not be unnecessarily visited by the police when they make a report, saving victims further embarrassment or distress.
The tool also creates a hash, or digital fingerprint, of each image. This hash can then be used to detect and block attempts to share the image more widely or re-upload it online, for example via social media.
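To illustrate the general idea of hash-based matching (not the IWF’s actual implementation, which is not detailed here and in practice would likely rely on a perceptual hashing scheme that survives resizing and re-encoding), the minimal sketch below computes a cryptographic digest of an image file and checks it against a block list of known hashes. The file name and the example hash are hypothetical.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Compute a hexadecimal SHA-256 digest of an image or video file's bytes.

    A platform holding a list of such digests can refuse uploads whose digest
    matches a known entry, without ever needing to store the image itself.
    """
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()

# Hypothetical block list of previously reported images.
block_list = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

if image_fingerprint("upload.jpg") in block_list:  # "upload.jpg" is a placeholder
    print("Upload blocked: matches a known reported image")
```

A simple digest like this only matches byte-identical copies, which is why content-matching services generally prefer perceptual hashes; the sketch is intended purely to show how sharing a fingerprint, rather than the image itself, allows re-uploads to be blocked.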
IWF chief executive Susie Hargreaves said: “When images of children and young people are taken and spread around the internet, they lose control. This is about giving them that control back. Once those images are out there, it can be an incredibly lonely place for victims, and it can seem hopeless. It can also be frightening, not knowing who may have access to these images.
“This tool is a world first. It will give young people the power, and the confidence, to reclaim these images and make sure they do not fall into the wrong hands online.”
Cormac Nolan, service head of Childline Online, added: “The impact of having a nude image shared on the internet cannot be underestimated and for many young people, it can leave them feeling extremely worried and unsure of what to do or who to turn to for support.
“That is why Childline and the IWF have developed Report Remove to provide young people with a simple, safe tool that they can use to try to regain control over what is happening and get this content erased.
“At Childline, we also want to remind all young people that if they discover that a nude image of themselves has been shared online, they do not need to deal with this situation alone and our Childline counsellors are always here to listen and help provide support.”
There are various circumstances in which a young person may share a self-generated sexual image. Some may have sent pictures for fun, or to a boyfriend or girlfriend, only for them to be shared later without consent, while others may have been blackmailed by bullies, or groomed online by adult predators.
The IWF said it had seen reports of self-generated images more than double in the first three months of 2021, to 38,000 from 17,500. Inevitably, some of this increase will be linked to Covid-19 lockdowns.
The tool, which was piloted at the beginning of 2020, is backed by Yoti, a supplier of digital identity services. Users aged over 13 will have to verify their age to use it, which means they will need some form of identification: a UK passport or biometric residence permit; a driving licence, including a provisional one; or a Young Scot Card or CitizenCard, within the Yoti app.
For those who do not have access to these, discounted CitizenCards can be obtained using an NSPCC-supplied promotional code.
After proving their age using the Yoti app – which only has access to this information, not to any reported images – users will be directed back to Childline to make a report through the IWF.
More details of how the Report Remove tool works can be found here.