New tech code launched for apps, toys and social media to better protect children online

Standards that would force tech giants to make children’s privacy online a primary consideration have been published by the UK’s data regulator.

The final Age Appropriate Design Code has been published by the Information Commissioner’s Office (ICO), which hopes it will come into effect by autumn 2021, pending approval from Parliament.

Everything from apps to connected toys, social media platforms to online games, and even educational websites and streaming services, will be expected to make data protection of young people a priority from the design up.

The 15 provisions have been “clarified and simplified” since a draft was first revealed in April last year, following consultation with the industry and submission to the Government in November.

Privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings, the code states.

Location settings that allow the world to see where a child is should also be switched off by default.

Data collection and sharing should be minimised, and profiling that can be used to serve children targeted content should also be switched off by default.

“I believe that it will be transformational,” Information Commissioner Elizabeth Denham told the PA news agency.

“I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online. I think it will be as ordinary as keeping children safe by putting on a seat belt.”

Ms Denham (pictured) said the gaming industry and some other tech companies expressed concerns about the impact on their business models, but overall the move was widely supported by them.

“We have an existing law, GDPR, that requires special treatment of children and I think these 15 standards will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media.”

The code comes at a time of increased pressure on the tech industry to act on its possible impact on people’s mental health.

Ian Russell, who believes access to suicide content on social media helped his teenage daughter Molly take her life in 2017, has welcomed the code.

“It is shocking that in failing to make the necessary changes quickly enough, the tech companies have allowed unnecessary suffering to continue,” he said.

“Although small steps have been taken by some social media platforms, there seems little significant investment and a lack of commitment to a meaningful change, both essential steps required to create a safer world wide web.

“The Age Appropriate Design Code demonstrates how the technology companies might have responded effectively and immediately.”

Andy Burrows, head of child safety online policy at the NSPCC, said the code would force social networks to “finally take online harm seriously and they will suffer tough consequences if they fail to do so”.

He said: “For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.

“It is now key that these measures are enforced in a proportionate and targeted way.”

Facebook, which has been under the spotlight for its approach to the safety of its users, said: “We welcome the considerations raised by the UK Government and Information Commissioner on how to protect young people online.

“The safety of young people is central to our decision-making, and we’ve spent over a decade introducing new features and tools to help everyone have a positive and safe experience on our platforms, including recent updates such as increased Direct Message privacy settings on Instagram.

“We are actively working on developing more features in this space and are committed to working with governments and the tech industry on appropriate solutions around topics such as preventing underage use of our platforms.”

What are the 15 measures designed to protect children online?

The data regulator has set out 15 measures to make children’s privacy online a top priority for tech firms. But what does the ICO’s final Age Appropriate Design Code tell companies to do?

1. Best interests of the child

The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by a child.

2. Data protection impact assessments

Firms should “assess and mitigate risks to the rights and freedoms of children” who are likely to access an online service, where those risks arise from data processing.

They should take into account differing ages, capacities and development needs.

3. Age-appropriate application

A “risk-based approach to recognising the age of individual users” should be taken.

This should either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from data processing, or apply the standards in this code to all users instead.

4. Transparency

Privacy information provided to users “must be concise, prominent and in clear language suited to the age of the child”.

5. Detrimental use of data

Children’s personal data must not be used in ways that have been “shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice”.

6. Policies and community standards

Uphold published terms, policies and community standards.

7. Default settings

Settings must be set to “high privacy” by default.

8. Data minimisation

Collect and retain “only the minimum amount of personal data” needed to provide the elements of the service in which a child is actively and knowingly engaged.

Give children separate choices over which elements they wish to activate.

9. Data sharing

Children’s data must not be disclosed, unless a compelling reason to do so can be shown.

10. Geolocation

Geolocation tracking features should be switched off by default.

Provide an “obvious sign for children when location tracking is active”.

Options which make a child’s location visible to others must default back to off at the end of each session.
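The default-settings rules in provisions 7 and 10 can be pictured as a simple settings model. The sketch below is purely illustrative and is not drawn from the code itself; the class and field names are hypothetical, chosen only to show privacy defaulting to high, location tracking defaulting to off, and visibility options resetting at the end of each session.

```python
from dataclasses import dataclass

@dataclass
class ChildPrivacySettings:
    """Hypothetical settings model reflecting the code's default rules."""
    privacy_level: str = "high"            # Provision 7: high privacy by default
    geolocation_enabled: bool = False      # Provision 10: tracking off by default
    location_visible_to_others: bool = False

    def end_session(self) -> None:
        # Provision 10: options making a child's location visible to others
        # must default back to off at the end of each session.
        self.location_visible_to_others = False

settings = ChildPrivacySettings()
settings.location_visible_to_others = True  # child opts in during a session
settings.end_session()
assert settings.location_visible_to_others is False
```

The point of the pattern is that any weakening of protection is an explicit, per-session opt-in, never a persisted default.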

11. Parental controls

Children should be provided with age-appropriate information about parental controls.

If an online service allows a parent or carer to monitor their child’s online activity or track their location, provide an “obvious sign to the child when they are being monitored”.

12. Profiling

Switch options which use profiling off by default.

Profiling should only be allowed if there are “appropriate measures” in place to protect the child from any harmful effects, such as content that is detrimental to their health or wellbeing.

13. Nudge techniques

Do not use nudge techniques to “lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”.

14. Connected toys and devices

Connected toys and devices should include effective tools to ensure they conform to the code.

15. Online tools

Children should be provided with prominent and accessible tools to exercise their data protection rights and report concerns.

Copyright (c) PA Media Ltd. 2020, All Rights Reserved. Picture (c) The Information Commissioner.
