Internet technology and social media firms failing to protect children from abuse, inquiry finds
Technology giants and social media firms have “failed” in attempts to prevent access to child sex abuse images, allowing for “an explosion in online-facilitated” crimes against children, an inquiry has found.
Industry leaders such as Microsoft, Facebook, Google and Apple have all struggled to get to grips with “the scale of the problem on their platforms and services”, and should “do more to identify the true scale of the different types of offending”, such as child grooming, the Independent Inquiry into Child Sexual Abuse report found.
It said regulation of the internet “was now required”, and called on the Government to press industry leaders into a raft of measures designed to limit abuse, including pre-screening images uploaded to the web and introducing new age-verification technology.
Professor Alexis Jay (pictured), inquiry chairwoman, said: “The serious threat of child sexual abuse facilitated by the internet is an urgent problem which cannot be overstated.
“Despite industry advances in technology to detect and combat online facilitated abuse, the risk of immeasurable harm to children and their families shows no sign of diminishing.
“The panel and I hope this report and its recommendations lead internet companies, law enforcement and government to implement vital measures to prioritise the protection of children and prevent abuse facilitated online.”
The report is based on 14 days of public hearings held in January 2018 and May 2019, during which the Met – Britain’s biggest police force – said it witnessed a 700% spike in the number of online child abuse cases referred to it by national investigators over three years.
It also heard how live-streaming websites were “enabling” paedophiles to widely share videos of child sexual abuse by failing to effectively combat the threat.
In its 114-page report, published on Thursday, the inquiry made four recommendations to the Government. These were:
- To require industry to pre-screen material before it is uploaded to the internet to prevent access to known indecent images of children;
- To press the WeProtect Global Alliance – a group comprising 97 governments, 25 technology companies and 30 civil society organisations – to take more action internationally to ensure that those countries hosting indecent images of children implement legislation and procedures to prevent access to such imagery;
- To introduce legislation requiring providers of online services and social media platforms to implement more stringent age verification techniques on all relevant devices; and
- To publish, without further delay, the interim code of practice in respect of child sexual abuse and exploitation as proposed by the Online Harms White Paper.
The Government is currently working on new legislation around online harms, including placing a statutory duty of care on tech companies to keep their users safe, overseen by an independent regulator.
Earlier this month the UK joined the US, Canada, Australia and New Zealand in formally launching the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, which detailed actions tech companies should take to protect younger users on their platforms.
The pledges range from preventing existing and new child sex abuse material from appearing on platforms, to taking steps to stop the live-streaming of abuse, and to identifying and stopping grooming and predatory behaviour.
The proposals were endorsed by tech giants including Facebook, Google, Microsoft, TikTok, Twitter and Snap.
But Thursday’s report found there were no evident barriers to pre-screening images.
It said: “Industry has failed to do all it can to prevent access to images of child sexual abuse.
“The time has come to stop access to such imagery by requiring industry to pre-screen material. No industry witness said that such a step was technologically impossible.”
It said there had “been an explosion in online-facilitated child sexual abuse” and said “law enforcement is struggling to keep pace”.
The report also found that indecent images of children could “be accessed all too easily”, saying that the child involved was re-victimised each and every time the image was viewed.
The report said: “The time has come for the Government to stop access to indecent images of children by requiring industry to pre-screen material.”
It added that while there was evidence of “the positive intentions by industry to tackle online-facilitated child sexual abuse and exploitation”, there was “a lack of a coherent long-term strategy on how this is to be achieved”.
The report concludes the latest strand of the inquiry, which has also focused on the role of the political establishment in dealing with allegations of child sexual abuse.
Responding, David Miles, Facebook’s head of safety for Europe, Middle East and Africa, said the social media giant was an “industry leader” in combating child sexual exploitation.
He added: “We have made huge investments in sophisticated solutions, including photo and video-matching technology so we can remove harmful content as quickly as possible.
“As this is a global, industry-wide issue, we’ll continue to develop new technologies and work alongside law enforcement and specialist experts in child protection to keep children safe.”
Susie Hargreaves, chief executive of the Internet Watch Foundation, a charity which contributed evidence to the inquiry, welcomed the report and said there was “no room for excuses”.
She said: “This report has not only identified the scale of the abuse and the challenges being faced, but also highlighted that the time has come for action.
“There is no longer any reason not to be decisive on taking action against the predators who exploit and abuse children online.
“This report makes it abundantly clear there is no room for excuses.”
Andy Burrows, head of child safety online policy at the NSPCC, described the report as “a damning indictment of Big Tech’s failure to take seriously their duty to protect young people from child abuse, which has been facilitated on their platforms on a massive scale”.
Labour MP Tracy Brabin, shadow secretary for digital, culture, media and sport, added: “It is utterly damning that major tech companies are not doing all they can to prevent access to images of child abuse online.”
Barnardo’s chief executive, Javed Khan, added: “We continue to urge the Government to act swiftly to regulate the internet and enforce serious sanctions for companies that break the rules and leave children in danger.”
Copyright (c) PA Media Ltd. 2020, All Rights Reserved. Picture (c) The Independent Inquiry into Child Sexual Abuse.