With children spending more time online, their safety in the digital environment is a key focus for regulators in the UK, particularly under the newly enacted Online Safety Act (OSA).
One way of keeping children safe online is to ensure they are not accessing content or services designed for an adult audience. Verifying a user's age online is challenging, and most options amount to nothing more than what some commentators call "verification theatre": a tick box asking a user to confirm they are over 18 purports to verify their age but in reality achieves very little and is easy to circumvent.
More sophisticated methods of age verification and assurance can certainly be used, but they risk breaching data protection law if they do not adhere to (UK) GDPR principles, including data minimisation and security. Online service providers caught by the Online Safety Act's mandatory age verification provisions are among those who will have to invest in data protection-compliant age verification and assurance methods to ensure they comply with their legal obligations.
Age assurance is an umbrella term covering the methods used to estimate or verify the age of users.
Age verification is any method designed to verify the exact age of users, or to confirm that a user is over 18.
UK GDPR provisions are supplemented by the ICO's Children's Code, a statutory code of practice comprising 15 principles to help information society services (ISS) likely to be accessed by children embed age-appropriate design and protect children's information rights online. ISS include virtually all apps, programs, search engines, social media platforms, online marketplaces, online games, websites and content streaming services. Age assurance and age verification can play a part in compliance.
The ICO published an updated opinion on age assurance in October 2023, replacing its previous opinion of October 2021. The opinion is written for ISS and age assurance providers, with the aim of explaining how they can use or develop age assurance technology in compliance with UK data protection law and deploy it in a proportionate manner. It has been updated partly to take account of the OSA, which received Royal Assent in October 2023.
The OSA places obligations on in-scope user-to-user and online search services whose services are likely to be accessed by children to protect them from illegal and harmful content. Safety duties under the OSA require the use of proportionate measures to mitigate and manage the risk of harm to children. For services where there is a risk of children encountering primary priority content harmful to children (as defined in the OSA), age verification or estimation must be used, unless the terms of service indicate that such content is prohibited and the policy applies to all users.
The OSA obligations are separate from those under the Children's Code. If you are in scope of the OSA and processing personal data, you must also comply with data protection law, including the Children's Code. Not all organisations in scope of the Children's Code will be caught by the OSA, but there is inevitably some overlap, as we discuss in more detail here.
There are a number of techniques available to help organisations estimate or verify the age of users, ranging from self-declaration to more robust methods.
While neither the Children's Code nor the OSA mandates the adoption of any one solution, the ICO's opinion says, in relation to data protection considerations, that the method used should be proportionate to the risks posed by the processing, which, in turn, will inform the level of age certainty required.
Where services are high risk, methods with the highest possible level of certainty on the age of users should be adopted so, for example, self-declaration on its own would not be appropriate. Organisations should be able to demonstrate that they have considered a wide range of assurance options and should evidence the rationale behind adoption of a particular method. The ICO acknowledges that technical feasibility and the age range and capabilities of users will result in a variance of age certainty across services.
The ICO warns that age assurance should be used carefully, as it carries its own data protection risks.
The rapid rise of AI over the past 12–18 months has prompted the ICO to issue warnings about the use of AI-driven age assurance methods. Some AI-driven methods use biometric data, which in many cases will constitute special category data to which additional protections apply. Where AI-driven age assurance methods rely on profiling, you must balance the risks posed by the profiling against its benefits in helping to establish the age of users. Care should be taken to address bias and to ensure users are not discriminated against, and you must ensure that the methods are sufficiently statistically accurate.
The ICO will continue to work with stakeholders both internationally and in the UK, including Ofcom, to build a coherent approach to regulating the safety of children online. The ICO's work and engagement on international standards on age assurance technologies will continue. It is hoped that these standards will provide further clarity on what will be expected from providers from a technical perspective when implementing age assurance systems. The ICO has confirmed that it intends to replace this updated opinion in due course. We also expect further guidance and clarification from Ofcom in relation to using age assurance and verification to comply with the OSA, so watch this space.
Victoria Hordern looks at the impact of AI on children, and at the role of AI and data protection legislation in protecting them from potential AI-related harms.
Debbie Heywood looks at what the ICO's Children's Code and the Online Safety Act mean by the term "likely to be accessed by children" and at overlaps and differences in requirements.
Sheppard Mullin's Liisa Thomas and Kathryn Smith look at what businesses operating in the US need to do to comply with current and incoming children's privacy laws.
ECIJA's Teresa Pereyra Caramé and Rubén Lahiguera Gallardo look at the Spanish Data Protection Authority's practical guide on protecting minors from inappropriate online content.