Deluge of false reports hindering efforts to remove online child abuse images

Analysts fighting online child abuse have been deluged with tens of thousands of false reports which could be hindering their vital work, a charity has warned.

The Internet Watch Foundation (IWF) said the stream of inappropriate and invalid reports could be delaying and preventing it from tracking down and tackling material which does show abuse.

The Cambridge-based charity works to remove images and videos of child abuse from the internet and acts on anonymous reports from the public to find and eradicate criminal content.

Last year, the public made 106,830 reports to the IWF in which the person said they were reporting child sexual abuse material.

Of these, 77,160 turned out to be false after being processed by the charity's analysts, meaning just over a quarter (29,670) met the criteria for action to be taken.

One individual has made 8,300 false reports since June 2019 despite being “repeatedly informed” that the nature of their reports is not within the charity’s remit, a spokesman said.

The reports the person made are thought to be mostly related to innocuous images from Instagram.

Other invalid reports have included adult pornography, pictures of child models and even holiday snaps – none of which were deemed to be breaking any laws.

But some have included distressing videos of beheadings and animal cruelty, the charity said.

The wrong reports are estimated to cost the organisation £150,500 a year.

The time taken to deal with them equates to four years of work by a single analyst – calculated based on an approximate average of five minutes to assess each report and a typical shift pattern over a year.

The charity launched a fresh page on its website on Tuesday in a bid to make it clearer to the public what to report and how.

A senior analyst, known by the pseudonym Peter Williams because of the sensitive work he carries out, said: “We don’t expect people to be able to make their own assessments of criminal content on the internet – that’s what we’re here for.

“But reporting anything and everything to us, when we’re here to deal with one really serious online criminality, takes up time and resources and diverts our efforts away from the victims.

“There could have been thousands of criminal sites that we could be getting offline – thousands of illegal images of children being sexually abused we could be removing from the internet.

“We are instead dealing with reports of something that we know we can’t do anything about.”

The not-for-profit organisation, which receives funding from the European Union among other sources, has analysts who deal with reports made by members of the public.

The trained experts, who receive regular counselling, view the material to determine whether it is criminal and work with internet companies to have it removed from websites.

The charity works with police all over the world to report any content which could suggest a child is in danger to help protect victims and bring abusers to justice.

The charity advises the public to:

  • Anonymously report images and videos of child sexual abuse to the IWF to be removed. This can include other visual depictions like computer-generated images.
  • Provide the exact URL where the material was found.
  • Do not report other harmful content – but use the charity’s list of other bodies to direct concerns to the right one.
  • Report child welfare concerns to the police.
  • Do not repeatedly report the same material.

Material which meets the criteria for the charity to take action includes pornographic, explicit and grossly offensive images of children and anything which shows sexual activity involving or in the presence of a child.

Chief executive Susie Hargreaves said: “If people stumble across these images online, they need to know we are a safe place they can turn to.

“You can report anonymously to us and we will get material analysed and removed.

“What we can’t do is remove material that is not actually against the law.

“Our analysts still have to look carefully at material to make sure there is nothing criminal hidden in there and, if people are reporting inappropriate things to us, it takes up a lot of their time.”

Copyright (c) PA Media Ltd. 2019, All Rights Reserved. Picture (c) Yui Mok / PA Wire.
