The ICO's Children's Code consists of 15 standards which information society service (ISS) providers are required to implement where their services are "likely to be accessed by children" (ie under-18s), in order to embed age appropriate design where personal data is processed in the provision of those services.
The UK's Online Safety Act (OSA), which was enacted on 26 October 2023, requires online user-to-user and search services to protect users from certain types of illegal and harmful content. Additional obligations apply to in-scope services "likely to be accessed" by children.
Not all organisations caught by the Children's Code will be impacted by the OSA, but those caught by both will need to understand what "likely to be accessed" by children means for each.
Children's Code
The ICO has resisted requests to put a numerical threshold on what "likely to be accessed" means. The Children's Code itself says: "we consider for a service to be 'likely' to be accessed, the possibility of this happening needs to be more probable than not… In practice, whether your service is likely to be accessed by children or not is likely to depend on:
- the nature and content of the service and whether that has particular appeal for children, and
- the way in which the service is accessed and any measures you put in place to prevent children gaining access".
The Code refers to children forming a "substantive and identifiable user group". The ICO says a common sense approach should be applied.
The ICO has also produced guidance in the form of FAQs and case studies on what "likely to be accessed by children" means for the purposes of the Code. This was updated in October 2023 to add further clarifications.
Where children form a substantive and identifiable user group, the "likely to be accessed" definition will apply. The Code applies not only to services intended to be used by children, but also to those not aimed at children which are accessed "by a significant number of children", so the number of children accessing the service should be "material". In assessing this, the ICO recommends that services consider whether it is reasonable to conclude that under-18s form a material group of people using the service.
The ICO suggests that a "significant" group in this context does not mean that a large number of children must be using the service, nor that children must form a substantial proportion of users; it means there are more than a de minimis or insignificant number of children using the service. Whether this low threshold is met depends on a variety of factors relating to the type of service, how it has been designed, and the personal data processing risks it presents to children. What is significant may vary based on:
- the number of people using the service
- the number of users likely to be children
- the data processing risks the service poses to children.
The guidance sets out a non-exhaustive list of factors which can be taken into account when assessing whether children are likely to access a service. The different features of the service, and whether children are likely to access all or only parts of it, should be considered, and the decision-making must be justifiable. Assessments should be risk-based and carried out in a proportionate way. The ICO's suggested list covers consideration of:
- whether children can access the service, or whether there are systems or processes in place to prevent this. If age-gating is used, it must be robust and effective and not merely an extension of an adult site (ie one which allows access to other parts of the adult site); self-declaration is unlikely to be sufficient
- the number of child users of the service and the proportion of total UK users or children this represents
- any available research evidence
- information relating to advertising targeted at children or likely to appeal to them on the service
- information on complaints received about children accessing or using the service
- types of content, design features or activities which are appealing to children
- whether children are known to like and access similar services
- any relevant evidence relating to the business or operating model
- the way the service is marketed, described and promoted.
The ICO stresses that a Data Protection Impact Assessment (DPIA) must be carried out where services are being offered to children, in order to assess risk and, where appropriate, mitigate it. But ISS providers must assess whether children are likely to access their service even if it is intended to be adult-only; this is part of the accountability requirement. Even where the DPIA concludes that children are not likely to access the service, that conclusion should be recorded, justifiable and regularly reviewed.
Online Safety Act
The OSA places obligations on providers of online user-to-user and search services (or parts of them) likely to be accessed by children. Broadly speaking, these duties involve conducting risk assessments, protecting children from illegal content and from content likely to be harmful to them, and complying with transparency, reporting and redress duties.
Under the OSA, services are likely to be accessed by children (under-18s) where:
- (a) it is possible for children to access the service or a part of it, and
- (b) a significant number of children are users of the service (or part of it), or the service (or part) is of a kind likely to attract a significant number of users who are children (limb (b) being the "child user condition") (section 37).
In order to determine whether children are able to access the service and whether the child user condition is met, potentially in-scope services are required to conduct a children's access assessment. The assessment should be based on the actual number of users of a service (as opposed to its intended users) and should be repeated regularly, as well as where certain 'trigger' events occur.
Child-specific duties under the OSA relate only to the parts of a service that it is possible for children to access. A provider is only entitled to conclude that children cannot access the service (or part of it) if age verification or estimation (but not self-declaration of age) is used, with the result that children are not normally able to access that service or part. Services that fail to carry out the access assessment will be treated as likely to be accessed by children, and Ofcom (the OSA's regulator) can determine that a service is likely to be accessed by children in certain circumstances.
Note also that the duty to "prevent" children encountering primary priority content harmful to children requires providers to use age verification or estimation (unless the terms of service indicate such content is prohibited and that policy is applied to all users). Even where not required, age verification and estimation are given as examples of measures which can be used to comply with all of the obligations imposed regarding the protection of children.
We don't yet know whether Ofcom will interpret "significant" in the same way as the ICO, and the legislation itself does not provide a definition.
We'd hope, at the least, that there won't be a conflict, as the ICO and Ofcom are expected to work together as part of the Digital Regulation Cooperation Forum precisely to ensure they do not take conflicting approaches to areas where their roles overlap. We are likely to find out more when Ofcom publishes its next batch of draft guidance and codes of practice, which will focus on child safety and will include draft guidance on carrying out children's access assessments. This is expected in Spring 2024. The obligation to carry out the first children's access assessments will apply three months from publication of Ofcom's final guidance on them – likely to be in early 2025.
A holistic approach
Whether or not the definitions and criteria for what being accessible to children means turn out to be the same under the Children's Code and the OSA, there are definite overlaps between the two regimes, although not all organisations in scope of the Children's Code will be caught by the OSA.
There are, however, considerable differences in terms of the obligations each engages. The Children's Code is a set of overarching principles to help ensure data protection standards and embed privacy by design in order to protect children's data. Complying with these principles may also help with compliance around children's protections under the OSA, but it will certainly not be enough on its own.
Notably, the OSA contains a stringent requirement to use age verification or estimation in relation to certain types of particularly harmful content, it requires certain provisions to be included in terms of service, and it takes a more explicitly granular approach to different types of content and different age groups (read more here). The ICO updated its 2021 guidance on age assurance in January 2024, partly to take this requirement into account and to help organisations required to use age assurance under the OSA do so in a data protection-compliant way.
As noted above, until Ofcom publishes its draft guidance relating to children, it is not possible to do a full comparison. Having said that, those used to assessing whether their services are likely to be accessed by children, and to putting risk mitigation procedures in place, are going to be ahead of the game where they are also caught by the OSA and, potentially, by non-UK legislation dealing with similar issues such as the EU's Digital Services Act.