Digital regulation in the UK advanced significantly in 2021 as policymakers attempted to address the issues caused by the inexorable growth of the online world. Concerns that children are at particular risk are now being met with policy and legislative proposals to ensure their safety online.
The Online Safety Bill (OSB), once in force, will impose a statutory duty of care on providers of certain user-to-user and search services to protect their users from harm. All organisations within scope have duties and obligations specific to child users, requiring them to mitigate the risk of harm to children online.
The Department for Digital, Culture, Media and Sport (DCMS) recently published a collection of guidance to help businesses protect children on their online platforms. Although it is not specifically tied to any legislative obligations and is targeted at small and medium-sized enterprises, it provides high level guidance around data protection, age appropriate consent, and protecting children from child sexual exploitation and abuse (CSEA) and terrorism content.
Organisations in scope also need to comply with the Children's Code, a statutory code created by the Information Commissioner's Office (or ICO, the UK's data protection authority). The Children's Code, also known as the Age Appropriate Design Code, applies now, and the ICO will begin enforcing it from 2 September 2021. The Code cannot be enforced on its own; however, the ICO will rely upon it when considering possible breaches of the UK GDPR concerning children's data in the context of digital services.
Although they are devoted to addressing different aspects of child safety online, there are clear synergies between the broad obligations created by both the Children's Code and the OSB, indicating that the direction of travel, in the UK at least, favours an increasingly hands-on approach.
For both the OSB and the Children's Code, a child is a person under the age of 18.
Providers will be within scope for services which are "likely to be accessed by children". This is the same language used in the Children's Code – it covers not only services targeted at children, but also those likely to be accessed by them. In practice, this is likely to be interpreted broadly: ICO guidance suggests that "likely" means "more probable than not", and a similar interpretation can be expected to apply in the context of the OSB.
There is a duty on providers, in effect, to carry out a scoping exercise to establish whether children use the service. This assessment should determine whether the "child user condition" is met: that is, whether a significant number of children are users of the service, or whether the service is likely to attract a significant number of child users.
If the assessment determines that there are child users, this condition is clearly satisfied. Even if no child users are identified at a given point in time, features of a service which may appeal to children could produce the same result.
There are a number of obligations specific to children for services within scope of the OSB (see here for our coverage of the obligations more generally).
In particular, there is an overarching duty to protect children's online safety. This mirrors the approach taken in the Children's Code which sets up the first of the 15 standards as an obligation to "always act in the best interests of the child". The overarching duty in the OSB is broken down into several sub-duties:
This distinguishes children in general from children in different age groups. ICO guidance on the Children's Code splits out different age groups up to the age of 17, and it seems likely that organisations will be expected to take a similar approach under the OSB. Types of harm, and the impacts of such harms, will clearly differ among children of different ages. For example, content of a sexual nature may be highly harmful to young children but less harmful to older teenagers.
"Primary priority content" (and "priority content") means content which is designated as such in secondary legislation. This means that there will be categories of content which should be treated as de facto harmful by service providers, though what these are is not yet clear.
There is also a qualifier of "proportionate". This means that, while the OSB asks a lot of organisations within scope, the OSB's enforcer, Ofcom, is unlikely to expect all organisations of every type to take every step available to them to prevent exposure of children to harmful content. Instead, the OSB asks organisations to adopt a risk-based approach.
Service providers need to specify:
As with the Children's Code and its requirements for clarity and transparency with children about their personal data and privacy, it appears that the OSB will require organisations to have terms of service which are comprehensible to children, where children are likely to engage with their services. The OSB seeks to empower children to report and challenge harmful content they encounter, in order to prevent others encountering it too. The actions they can take should be conveyed in plain English, in a child-friendly format.
This is defined as content which:
As with most categorisations in the OSB, the treatment of harmful content is broad. It is unclear what is meant by a child of "ordinary sensibilities", or what "significant adverse physical or psychological impact" may look like. However, the UK government has published Interim Codes of Practice concerning CSEA and terrorist content. The examples provided demonstrate clear risks to both the physical and psychological wellbeing of children, but borderline cases may prove difficult to assess.
Another key duty falling on service providers will be to carry out children's risk assessments, to keep them up to date with Ofcom's designations of risk, and to carry out further risk assessments before making significant changes to the design or operation of the service. Wrapped into this is the requirement to notify Ofcom of any non-designated content harmful to children which is identified during such an assessment.
The children's risk assessment must identify, assess and understand:
It's clear that there will be no 'one size fits all' approach to this. The assessment should separately consider different risks to different age groups, how children interact with the service, and the types and severity of harm.
It is advisable for organisations to begin thinking about risks to children now. Assessment exercises are likely to be resource-intensive, especially for organisations which explicitly target children, or which know that, whether targeting or not, significant numbers of children use their services.
In particular, the OSB requires the assessment to consider how any algorithms may affect the harmful content encountered by children. The algorithms of large technology companies are often their greatest assets; the OSB will ask these companies to stop and consider how their algorithms impact children.
This guidance largely cross-refers to general data protection obligations under the law and the Children's Code, and to the ancillary information provided by the ICO. It sets out the following summary:
This guidance sets out how businesses should design their services to ensure content is suitable for children using the services:
There is additional guidance for services used by under 13s.
This guidance sets out how businesses should establish a safe environment for children to interact with each other, including by direct messaging, voice or video calls, commenting, reacting, liking, tagging and playing games. It sets out the following summary:
In line with the Interim Code of Practice on CSEA, the government is particularly concerned about online CSEA. Businesses are advised to report all forms of CSEA as soon as possible to their local police force.
There is an irony that guidance and regulatory requirements which emphasise the importance of clarity and accessibility for child users are themselves complicated and challenging to assess. In most areas of regulation, it is clear who is subject to the regulator's oversight; that is not the case in the context of online harms. Many organisations may be apprehensive about preparing for the OSB and the Children's Code, but even if compliance is likely to take a long time, it is much better to have a clear understanding of the level of exposure and the work needed to manage it.
Organisations should first assess the applicability of the OSB and the Children's Code to their operations. Once that is done, they need to put a plan in place to ensure they can achieve compliance and mitigate any unavoidable risks. If, following an assessment, you are still unsure whether your organisation falls within the scope of the OSB and the Children's Code, it would be wise to err on the side of caution and assume that it does – the regulators are likely to assert their authority as widely as possible, and it is better to be prepared to meet this than to seek to avoid it.
To discuss the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Communications team.